SD Data Load (Deliveries Too High In Number)

Hi All,
I am loading data for application components 11, 12, and 13.
Filling the setup table for application component 12 is taking a very long time, so I am considering an alternative approach.
Say the company has the following sales organizations: 1100, 1200, 1300.
Steps:
1. Clear the init from RSA7, delete the data from the setup table via LBWG, and clear the queue in LBWQ.
2. Go to transaction OLI8BW (Deliveries) and enter Sales Org. 1100.
3. Trigger the init with data transfer after the setup table is filled for sales org 1100.
4. Fill the setup table again for the other sales orgs, 1200 and 1300, and trigger repair full requests for those sales orgs.
Please let me know if the steps above are correct; if not, please mention the correct steps.
Thanks..
Regards
Madhusudan

Hi,
I'm assuming you are setting up the LO data load flow for application 12 for the first time, so your DataSource will not yet exist in RSA7/LBWQ.
First of all, you need to lock the source system / the related transactions so that no new documents are posted during the fill.
1. Delete (LBWG) and fill (OLI8BW) the setup tables as you need (a quick row-count check is sketched after this list):
       a. selection on sales org 1100.
2. Trigger an init InfoPackage with data transfer: the sales org 1100 data is moved to PSA, the delta pointer is set for your DataSource, and the DataSource becomes visible in RSA7. The PSA data can then be loaded to further data targets.
3. Fill the setup tables for sales orgs 1200 and 1300.
4. Run repair full requests with selections on sales orgs 1200 and 1300, and load the same data from PSA to the further targets.
5. Select the V3 update method and schedule the V3 job to move delta records from the source (SM13/LBWQ) to RSA7.
6. Unlock the source system.
7. Once you see records in RSA7, you can load the delta records into BW using a delta InfoPackage.
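To verify each fill before triggering the init or the repair fulls, you can count the rows written to the setup table. A minimal sketch, assuming the application 12 item-level setup table is MC12VC0ITMSETUP (the usual <extract structure> + SETUP naming for 2LIS_12_VCITM); verify the exact table name in SE11 first:

REPORT z_check_setup_fill.

* Hypothetical check: count the rows in the application 12 item-level
* setup table after an OLI8BW run. The table name is an assumption
* based on the <extract structure> + SETUP convention; confirm in SE11.
DATA lv_count TYPE i.

SELECT COUNT(*)
  FROM mc12vc0itmsetup
  INTO lv_count.

WRITE: / 'Rows in setup table MC12VC0ITMSETUP:', lv_count.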
Thanks

Similar Messages

  • OWB 10g - The time taken for data load is too high

    I am loading data on the test datawarehouse server. The time taken for loading data is very high. The size of data is around 7 GB (size of flat files on the OS).
    The time it takes to load the same amount of data on the production server from the staging area to the presentation area (data warehouse) is close to 8 hours maximum.
    But in the test environment, the time taken to execute one mapping (containing 300,000 records) is itself 8 hours.
    The version of Oracle database on both the test and production servers is the same i.e., Oracle 9i.
    The configuration of the production server is : 4 Pentium III processors (2.7 GHz each), 2 GB RAM, Windows 2000 Advanced Server, 8 kilobyte primary memory cache, 512 kilobyte secondary memory cache, 440.05 Gigabytes Usable Hard Drive Capacity, 73.06 Gigabytes Hard Drive Free Space
    The configuration of the test server is : 4 Pentium III processors (2.4 GHz each), 1 GB RAM, Windows 2000 Advanced Server, 8 kilobyte primary memory cache,
    512 kilobyte secondary memory cache, 144.96 Gigabytes Usable Hard Drive Capacity, 5.22 Gigabytes Hard Drive Free Space.
    Can you guys please help me detect the possible causes of such erratic behaviour of the OWB 10g tool?
    Thanks & Best Regards,
    Harshad Borgaonkar
    PwC

    Hello Harshad,
    2 GB of RAM doesn't seem like very much to me. I guess your bottleneck is I/O. You've got to investigate this (keep an eye on long-running processes). You didn't say much about your target database design. Do you have a lot of indexes on the target tables, and if so, have you tried dropping them before loading? Do your OWB mappings require a lot of lookups (if so, appropriate indexes on the lookup tables are very useful)? Do you use external tables? Are you talking about loading dimension tables, fact tables, or both? You've got to supply some more information so that we can help you better.
    Regards,
    Jörg

  • Hashtable insert failed. Load factor too high : Dot net 4.5

    I'm getting an exception “System.InvalidOperationException: Hashtable insert failed. Load factor too high.”
    My application was developed in VS2010 on the 4.0 framework and works fine in our development environment (4.0 version; for the 4.0 framework I was able to find the hotfix and installed it there). In the production environment the framework has been upgraded to 4.5, and we often get the above exception.
    Application developed: Windows 7 - 32 bit OS / VB.NET 4.0
    Development environment (hotfix installed): Windows Server 2008 - 64 bit / .NET FW 4.0
    Prod environment: Windows Server 2008 - 64 bit / .NET FW 4.5
    Kindly help me resolve this issue.
    Thanks.

    Hello,
    For this issue, we will focus on your newly posted thread:
    https://social.msdn.microsoft.com/Forums/vstudio/en-US/a3feddfd-ff21-438c-9704-661375f096af/is-that-the-hot-fix-kb2803754-included-in-net-framework-451-version-4550938?forum=clr#a3feddfd-ff21-438c-9704-661375f096af
    >>Can you confirm that the hotfix KB2803754 is available in version
    4.5.50938 (.NET Framework 4.5.1)?
    You could run the test described in the link above to check whether .NET 4.5.1 contains this KB (currently I do not have a proper environment to test it).
    Regards.

  • Data loading time too long

    Hi there,
    I have two InfoPackages: one for year 2003 and one for year 2004. For year 2003 I use the interval 200301-200312 and for year 2004 the interval 200401-200412 as the data selection. Then I started to load data from R/3. For year 2003 we have 4.9 million records, and the load took 6 hours; for year 2004 we have 5.5 million records, and the load took 46 hours. The interesting thing was that when I tried to use the 2003 InfoPackage with the interval 200401-200412 as the data selection for loading the 2004 data, it took only about 7 hours! Why? Is something wrong with the InfoPackage for 2004? Any idea or suggestion is greatly appreciated.
    Weidong

    Hi Weidong,
    Check the processing type in both InfoPackages. Maybe one of them uses "PSA and then into Data Target" while the other uses "PSA and Data Target in parallel".
    Bye
    Dinesh

  • Which CPU heatsink/fan for Z77A-GD65? - if 90 C at 100% load is too high?

    With the stock Intel heat sink and the CPU (i7 3770) under 100% load for hours, the CPU temp hovers around 89-90 C. My case has 2 x 120 mm fans in the front and 2 x 120 mm fans on the back. With a room temperature of 20-22 C, the typical idle CPU temp is 30-32 C.
    Two questions please.
    1) Is keeping the CPU temperature at 90 C all day  "safe"? 
    2) What aftermarket heat sink would you recommend for this MB, if the temperature should be dropped a few degrees?
    My observation for the Z77A-GD65: there isn't much real estate around the CPU for a large cooler, and furthermore my case cannot fit one of those tall cooler heat sinks with a 120 mm fan on the fins.
    I don't overclock, but run multiple CPU intensive simulations.
    i7 3770,
    Z77A-D65 MB,
    Corsair 750TX PSU,
    Corsair Vengeance LP 4 x 8GB RAM (CML16GX3M2A1600C10),
    using internal Intel graphics, and I have 5 x 1TB HDDs of various brands.

    UPDATE: 
    Thank you again for the good suggestions.
    I installed the Arctic Freezer 13 CO cooler and the temp dropped about 25 C, from 90 C to 65 C, with 100% CPU load (my simulation program) for nearly 10 hours.
    Tested on Prime95 => 71 C max after 10 hours. Before the upgrade, I dared not run Prime95 because the temperature would quickly shoot up above 90 C.
    IntelBurnTest 2.54 Max Stress => 80 C max.
    The 92 mm cooler fan clears barely above the low-profile RAM. It may not fit over high-profile RAM (those with tall heat spreaders). Max RPM is about 2100. It's very quiet; I cannot tell the cooler fan noise apart from the case fan noise. After trying to get an optimal fan speed using the MB BIOS and MSI Control Center, I decided to turn off smart fan control and leave the fan speed at 100%. This "CO: Continuous Operation" fan is supposed to last a long time anyway.

  • Can subtitles cause data rate too high errors?

    I've been building a DVD in Encore CS5, using Sorenson as the transcoder.  I've built the disc many times with no issues.  Now I've added a Spanish subtitle track, and I am suddenly getting errors that the data rate is too high.  Can the subtitles cause this?
    These are my Sorenson settings.  Ultimately this will be a commercially pressed disc.  Do I need to adjust, and if so, how?
    Target 8400 kbps
    Min: 53% (4452)
    Max: 115% (9660)
    Total data rate: 8624
    2-pass VBR, 720x480.
    Audio: ATSC A/52
    Data rate: 224
    Sample: 48K
    Stereo, 16

    The question will be in part what the actual datarate is (and most particularly, datarate spikes). If we were using burned disks, I would worry. Since players don't have to play burned disks, they are less likely to be uniform in handling datarate variations. You are replicating; correct? So a player that fails to play correctly, is failing what it is supposed to do. As long as you are in spec, you should be okay.
    The other way to answer this: no, I would not trust Encore failing to throw an error as a sufficient guarantee. Does the replicator test for datarate?

  • Log Issue in HFM data load

    Hi,
    I'm new to Oracle Data Integrator.
    I have an issue with the log file name. I'm loading data into Hyperion Financial Management through ODI. In the interface, when we select the IKM SQL to HFM Data, there is a "log file enabled" option. I set it to true and gave the log file name as 'HFM_dataload.log'. After executing the interface, when I navigate to that log folder and view the log file, the file is blank. A new file, 'HFM_dataloadHFM6064992926974374087.log', is created instead, and the log details are written to it. Since I have to automate the process of picking up the everyday log file,
    * I need the log details to be written to the specified log name, i.e. 'HFM_dataload.log'.
    Also, I was not able to perform any action on the newly generated log file (copying it elsewhere or sending it by mail), since I cannot predict the numbers appended to the specified log file name.
    Kindly help me to overcome this issue.
    Thanks in advance.

    Thanks a lot for the idea.
    I have a further question about HFM data loads: in ODI Operator, a warning symbol is shown when a few records get rejected, instead of an error. Is it possible to make the load fail if one or more records are rejected?
    I have experience with Essbase data loads, where the operator step fails once the rejects reach a specified number of records.
    Please guide me if I am missing something.
    Regards,
    PrakashV

  • Automate data loads from SPM UI

    Hi Experts
    We have recently implemented SPM and are doing data loads manually.
    Is there a way of doing automatic data loads and releases, so that we only get involved if there is a warning or error?
    Please guide
    Regards
    Pankaj

    Hi Pankaj,
    It is possible to automate the data loads into SPM.
    A number of customers have customized this into their solution as part of the implementation.
    We are planning to deliver OOTB functionality that allows for data load automation. It is currently being validated.
    Please email me directly at [email protected] so we can have a call to discuss the other questions that you have posted.
    Kind regards,
    Michael

  • PGC...data rate too high

    Hello,
    The message nunew33, "Mpeg not valid error message" #4, 31 Jan 2006 3:29 pm describes a certain error message. The user had problems with an imported MPEG movie.
    Now I receive the same message, but the MPEG that is causing the problem is created by Encore DVD itself!?
    I am working with the German version, but here is a rough translation of the message:
    "PGC 'Weitere Bilder' has an error at 00:36:42:07.
    The data rate of this file is too high for DVD. You must replace the file with one of a lower data rate. - PGC Info: Name = Weitere Bilder, Ref = SApgc, Time = 00:36:42:07"
    My test project has two menus and a slide show with approx. 25 slides and blending as transition. The menus are ok, I verified that before.
    First I thought it was a problem with the audio I use in the slide show. Because I am still in the state of learning how to use the application, I use some test data. The audio tracks are MP3s. I learned already that it is better to convert the MP3s to WAV files with certain properties.
    I did that, but still the DVD generation was not successful.
    Then I deleted all slides from the slide show except the first. Now the generation worked!? Since a single slide (an image file) cannot have a data rate per second, there was no sound any more, and the error message appears AFTER the slide shows are generated, while Encore DVD is importing video and audio just before the burning process, I think the MPEG that contains the slide show is the problem.
    But this MPEG is created by Encore DVD itself. Can Encore DVD create Data that is not compliant to the DVD specs?
    The last two days I had to find the cause of a "general error". Eventually I found out that image names must not be too long. Now there is something else, and I still have to waste time finding solutions for apparent bugs in Encore DVD. Why doesn't the project check find and tell me about such problems? The problem is that the errors appear at the end of the generation process, so I always have to wait, in my case, approx. 30 minutes.
    If the project check would have told me before that there are files with file names that are too long, I wouldn't have had to search or this for two days.
    Now I get this PGC error (what is PGC, by the way?), and still have no clue, because again the project check didn't mention anything.
    Any help would be greatly appreciated.
    Regards,
    Christian Kirchhoff

    Hello,
    thanks, Ruud and Jeff, for your comments.
    The images are all scans of ancient paintings, and they are all rather dark. They are not "optimized", meaning they are JPGs right now (RGB), and they are bigger than the resolution for PAL 4:3 would require. I just found out that if I choose "None" as scaling, there is no error, and the generation of the DVD is much, much faster.
    A DVD with a slide show containing two slides and a 4-second transition takes about 3 minutes to generate when scaling is set to anything other than "None". Without scaling it takes approx. 14 seconds. The resulting movie's size is the same (5.35 MB).
    I wonder why the times differ so much. Obviously the images have to be scaled to the target size. But it seems the images are not scaled only once, with the scaled versions cached and reused for the blend effect; instead, the source images seem to be scaled again for every frame.
    So I presume that the scaling, unfortunately, has an effect on the resulting movie too, and thus influences the success of the DVD generation process.
    basic situation:
    good image > 4 secs blend > bad image => error
    variations:
    other blend times don't cause an error:
    good image > 2 secs blend > bad image => success
    good image > 8 secs blend > bad image => success
    other transitions cause an error, too:
    good image > 4 secs fade to black > bad image => error
    good image > 4 secs page turn > bad image => error
    changing the image order prevents the error:
    bad image > 4 secs blend > good image => success
    changing the format of the bad image to TIFF doesn't prevent the error.
    changing colors/brightness of the bad image: a drastic change prevents the error. I adjusted the histogram and made everything much lighter.
    Just a gamma correction with values between 1.2 and 2.0 didn't help.
    changing the image size prevents the error. I decreased the size. The resulting image was still bigger than the monitor area, thus it still had to be scaled a bit by Encore DVD, but with this smaller version the error didn't occur. The original image is approx. 2000 px x 1400 px. Decreasing the size by 50% helped. Less scaling (I tried 90%, 80%, 70% and 60%, too) didn't help.
    using a slightly blurred version (gaussian blur, 2 px, in Photoshop CS) of the bad image prevents the error.
    My guess is that the error depends on rather subtle image properties. The blur doesn't change the image's average brightness, the balance of colors, or the size of the image, but still the error was gone afterwards.
    The problem is that I will work with slide shows that contain more images than two. It would be too time consuming to try to generate the DVD over and over again, look at which slide an error occurs, change that slide, and then generate again. Even the testing I am doing right now already "ate" a couple of days of my working time.
    The only thing I can do is use a two-image slide show and test image couple after image couple. If n is the number of images, I will spend (n - 1) times 3 minutes (the average time to create a two-slide slide show with a blend). But of course I will try to prepare the images and make them as big as the monitor resolution, so Encore DVD doesn't have to scale the images any more. That will make the whole generation process much shorter.
    If I use JPGs or TIFFs, the pixel aspect ratio is not preserved when the image is imported. I scaled one of the images in Photoshop, using a modified menu file that was installed with Encore DVD, because it already has the correct size for PAL, the pixel aspect ratio, and the guides for the safe areas. I saved the image as TIFF and as PSD and imported both into Encore DVD. The TIFF is rendered with a 1:1 pixel aspect ratio and NOT with the D1/DV PAL aspect ratio that is stored in the TIFF. Thus the image gets narrowed and isn't displayed the way I wanted any more. Only the PSD looks correct. But I think I saw this already in another thread...
    I cannot really understand why the MPEG encoding engine would produce bit rates that are illegal and that are not accepted afterwards, when Encore DVD is putting together all the stuff. Why is the MPEG encoding engine itself not throwing an error during the encoding process? This would save the developer so much time. Instead they have to wait until the end, thinking everything went right, and find out then that there was a problem.
    Still, if somebody someday finds out more about the whole matter, I would be glad to read further explanations.
    Best regards,
    Christian

  • Data Load : Number of records count

    Hi Experts,
    I want to document the number of records transferred to BW during an InfoPackage execution.
    I want to automate the process by running a report in the background that fetches data from SAP tables about the number of records transferred by all my InfoPackages.
    I would like to know how I should proceed.
    I want to know the system tables that contain the same data that transaction RSMO displays.
    Kindly help with valuable replies.

    Hi,
    In order to get the record count report, you need to create a report based on the tables below:
    RSSELDONE, RSREQDONE, RSLDPIOT, RSMONFACT
    Check the below link which explain in detail with the report code as well.
    [Data load Quick Stats|http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/90215bba-9a46-2a10-07a7-c14e97bdb764]
    This doc also explains how to trigger a mail with the details to all.
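    If you want a concrete starting point, here is a minimal sketch of such a report, reading the monitor table RSREQDONE directly. The field names used (RNR, LOGDPID, DATUM, RECORDS) are assumptions from memory, not verified against a system; check them in SE11 for your BW release before building on this:

    REPORT z_ip_record_count.

    * Hypothetical report: list the record count per InfoPackage request
    * for a given day, reading the monitor table RSREQDONE. Field names
    * are assumptions; verify them in SE11.
    TYPES: BEGIN OF ty_req,
             rnr     TYPE c LENGTH 30,   "request number
             logdpid TYPE c LENGTH 30,   "InfoPackage ID
             datum   TYPE d,             "load date
             records TYPE i,             "records transferred
           END OF ty_req.

    DATA lt_req TYPE STANDARD TABLE OF ty_req.
    DATA ls_req TYPE ty_req.

    PARAMETERS p_datum TYPE d DEFAULT sy-datum.

    START-OF-SELECTION.
      SELECT rnr logdpid datum records
        FROM rsreqdone
        INTO TABLE lt_req
        WHERE datum = p_datum.

      LOOP AT lt_req INTO ls_req.
        WRITE: / ls_req-logdpid, ls_req-rnr, ls_req-records.
      ENDLOOP.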
    Regards
    KP

  • How do you fix error message "data rate for this file is too high for DVD.  You must replace this file with one of a lower data rate".

    When trying to burn a DVD it will go through the encoding step and at 98% we see the message 'data rate for this file is too high for DVD.  You must replace this file with one of a lower data rate".  We need help to correct this so we can complete burning to DVD. 

    What did you export from Premiere?
    Did you use the MPEG2-DVD preset... and did you make any changes to the preset?
    CS5-thru-CC PPro/Encore tutorial list http://forums.adobe.com/thread/1448923 may help

  • SPM Data Loads : Less number of records getting loaded in the Invoice Inbound DSO

    Dear Experts,
    We are working on a project where data from different non-SAP source systems is being loaded into SPM via flat file loads. We came across a very weird situation.
    For the other master and transaction data objects it worked fine, but when we loaded the invoice file, fewer records were loaded into the inbound DSO. The invoice file contained 80,000 records, but the inbound DSO has only 78,500 records. We are losing 1,500 records.
    We are unable to figure out which 1,500 records we are missing. We couldn't find any logs in the inbound invoice DSO, and we cannot tell whether the records are erroneous or whether there is an issue with something else. Is there a way to analyze the situation / the inbound invoice DSO?
    If there is an issue with the outbound DSO or the cube, we know it is possible to check the data load request, but for the inbound DSO we do not know how to analyze the issue and why it takes fewer records.
    Regards
    Pankaj

    Hi,
    Yes, this can happen in a DSO, because the data records are consolidated according to the semantic key: with your key field selection, several of the 80,000 file rows may share the same key values and collapse into a single record on activation, leaving fewer records (see the sketch below).
    If you have any routines, check the code for conditions that filter out records.
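    To see the effect outside of BW, here is a toy ABAP sketch of how rows sharing a semantic key collapse into one record, the way a DSO overwrites on activation. The key fields are hypothetical placeholders, not the actual SPM invoice DSO key:

    REPORT z_dso_key_collapse.

    * Toy illustration: rows with the same (assumed) semantic key are
    * reduced to a single record, which is why 80,000 file rows can
    * become 78,500 records in the DSO. Key fields are hypothetical.
    TYPES: BEGIN OF ty_inv,
             invoice_no TYPE c LENGTH 10,            "assumed key field
             item_no    TYPE n LENGTH 6,             "assumed key field
             amount     TYPE p LENGTH 8 DECIMALS 2,  "data field
           END OF ty_inv.

    DATA lt_inv TYPE STANDARD TABLE OF ty_inv.

    * ... fill lt_inv from the flat file (80,000 rows) ...

    SORT lt_inv BY invoice_no item_no.
    DELETE ADJACENT DUPLICATES FROM lt_inv COMPARING invoice_no item_no.
    * The rows remaining in lt_inv correspond to what the DSO keeps.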
    Regards.

  • PGC has an error--data rate of this file is too high for DVD

    Getting one of those seemingly elusive PGC errors, though mine seems to be different from many of the ones listed here. Mine is telling me that the data rate of my file is too high for DVD. The only problem is, the file it says has too high a data rate is a slideshow which Encore has built using imported JPG files. I got the message, tried going into the slideshow and deleting the photo at the particular spot in the timeline where it said it had the problem, and am now getting the same message again with a different timecode spot in the same slideshow. The pictures are fairly big, but I assumed that Encore would automatically resize them to fit an NTSC DVD timeline. Do I need to open all the pictures in Photoshop and scale them down to 720x480 before I begin with the slideshows?

    With those efforts, regarding the RAM, it would *seem* that physical memory was not the problem.
    I'd look to how Windows is managing both the RAM addresses and also its Virtual Memory. To the former, I've seen programs/Processes that lock certain memory addresses upon launch (may be in startup), and do not report this to Windows accurately. Along those lines, you might want to use Task Manager to see what Processes are running from startup on your machine. I'll bet that you've got some that are not necessary, even if IT is doing a good job with the system setup. One can use MSCONFIG to do a trial of the system, without some of these.
    I also use a little program, EndItAll2, for eliminating all non-necessary programs and processes when editing. It's freeware, has a tiny footprint, and usually does a perfect job of surveying your running programs and processes to shut them down. You can also modify its list, in case it wants to shut down something that IS necessary. I always exit from my AV, spyware, popup-blocker, etc., as these programs will lie to EndItAll2 and say that they ARE necessary, as part of their job. Just close 'em out in the task tray, then run EndItAll2. Obviously, you'll need to do this with the approval of IT, but NLE machines need all available resources.
    Now, to the Virtual Memory. It is possible that Windows is not doing a good job of managing a dynamic Page File. Usually, it does, but many find there is greater stability with a fixed size at about 1.5 to 2.5x the physical RAM. I use the upper end with great results. A static Page File also makes defragmenting the HDD a bit easier too. I also have my Page File split over two physical HDD's. Some find locating to, say D:\ works best. For whatever reason, my XP-Pro SP3 demanded that I have it on C:\, or split between C:\ and D:\. Same OS on my 3 HDD laptop was cool having it on D:\ only. Go figure.
    These are just some thoughts.
    Glad that you got part of it solved and good luck with the next part. Since this seems to affect both PrPro and En, sounds system related.
    Hunt
    PS Some IT techs love to add all sorts of monitors to the computers, especially if networked. These are not usually bad, but are usually out of the mainstream, in that most users will never have most of these. You might want to ask about any monitors. Also, are you the only person with an NLE computer under the IT department? In major business offices this often happens. Most IT folk do not have much, if any, experience with graphics or NLE workstations. They spend their days servicing database, word processing and spreadsheet boxes.

  • Number Ranges consideration while Data Load

    Hello team
    We are upgrading from 4.6 to ECC 6.0, and our ABAPer is going to try a test load. I have already configured number ranges in the new system to match what is in 4.6. Should the number ranges be deleted before the data load testing? How will the system react if the number ranges are already there?
    Thanks

    If your number range series have not changed, there will be no issues.
    You maintain the series of number ranges in PA04 and set that series number as the default value in the NUMKR feature.
    If you miss any of the above in customizing, you may get a number mismatch: say you have a series 100-200, but in the number ranges you entered 200-300; that can lead to trouble.
    I have never come across this situation myself, but this is just for information.

  • Another Data Rate Too High Error Question

    Hi all,
    I have seen quite a few posts about pgc errors, but most of them deal with slideshows and not video; and I can't find anything with the REF=KApgc error.
    I'm just putting all of my home movies onto an authored DVD, not rocket science.  I'm using the 'Film Submenu' preset with only 5 main movies, things like Easter, Birthday, etc.  My transition for each button is a slate (for lack of a better word) that says exactly what it is and when it happened, before each movie plays.
    After I had everything set up, I realized I didn't have an ability to "play all" that would include these slates.  Simple solution, make a 'play all' button that links to a timeline that has everything in it - the slate, then the movie, next slate, next movie and so on.  (When I say movie - they are all .mov files rendered out from After Effects, even the slates)
    My transcode settings are NTSC DV High Quality, 7 Mbps VBR 2-pass, and the "maximum quality" box is checked.
    When I run the 'Check Disc' all is fine.  Preview the disc, all is fine.
    Build Disc: Hours of Transcoding then the error: Data Rate Too High at *timecode* REF=KApgc.
    When this movie is by itself with a button click, it's fine.  It's only in the "play all" timeline that it produces the error.
    If the data rate isn't too high in one place, why is it too high in another?  It's the same thing, just a longer timeline.
    Please Help!
    Thanks.

    Thanks Jeff,
    I feel kind of silly for not thinking of that as a solution to my play all dilemma. That's why you're the expert, I guess.
    You wrote: "A better solution would be to create a playlist of all of your other timelines; that way, only one copy of each gets burned to disc. Hopefully the 'data rate' error will go away with these changes. -Jeff"
    Can I place my slates that are just assets (.mov) in the playlist, or can I only put timelines in there?
    Thanks again,
    -Jim

Maybe you are looking for

  • Itunes Version 11.1.4.62 changed my music folders

    3 or 4 years ago we purchased an iPod. I downloaded iTunes onto a Dell computer with the Windows XP operating system --- it was version 10.2.2. Just recently we purchased a new computer with the Windows 8 operating system and downloaded iTunes on it – version 11

  • URGENT: How to find out the Tables for Routines  in BW.

    Hi BW Gurus, how do I find out the tables for routines in BW? I have this routine id 45XFAEI7LKIFIRDUKQG127YWW; it is in an inactive state and I want to activate it. Thanks in advance, points will be rewarded. Regards, Maruthi.

  • BAPI_CTRACDOCUMENT_TRANSFER in FI-CA module

    Hi All, did anybody work with BAPI_CTRACDOCUMENT_TRANSFER in the FI-CA module? Actually I am filling all the parameters exactly but am not able to post the document. It's giving an error like 'Internal error: Lock missing for change / clear document item' An

  • IDoc distribution

    Hi, I have one query while dealing with ALE configuration. Suppose I have to distribute an IDoc to multiple systems, i.e. one sender, say client 100, and receivers, say clients 200, 500, 700. How do we handle distributing IDocs to multiple systems? Ples

  • My screen doesn't work and is blank, but I need to do a backup; the phone has a passcode but I can't type it in. What can I do?

    I woke up this morning and my phone is blank; it's working but the screen isn't showing anything. I need to do a backup before I take it into a store, but because there is a passcode on the phone, iTunes needs me to enter it on the phone before I can ca