GI, GR and yield

I would like to know which table stores the quantity and amount of goods issue and goods receipt for a process order?
Also, I can't explain SAP's expected yield variance: if the production quantity is 40 and the actual goods delivered is 38, then the expected yield variance is -2. But for a non-DLV order (REL order), delivered = 0 and production quantity = 40, so why is the expected yield variance 0?

For this:
You can find the GR & GI value in MSEG-DMBTR. There is no need to take the data from the costing view, because this is the value posted for the material for that quantity.
Check the confirmation list in COOISPI; you can find the confirmed quantity there.
You have confirmed 40 but the GR happened for only 38; that is why the variance is 2-.
For a non-delivery order you haven't confirmed anything, so there is no variance; that's why it is zero.
Instead of COIO, use COOISPI; I feel COIO is an old version of the report.
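To make the arithmetic concrete, here is a minimal sketch (hypothetical class and method names; it assumes, as the reply above says, that the variance is simply delivered quantity minus confirmed quantity):

```java
public class YieldVariance {
    // Expected yield variance: actual delivered (GR) quantity minus confirmed quantity.
    // For a REL order with no confirmation and no delivery, both are zero, so the variance is zero.
    static int expectedYieldVariance(int confirmedQty, int deliveredQty) {
        return deliveredQty - confirmedQty;
    }

    public static void main(String[] args) {
        System.out.println(expectedYieldVariance(40, 38)); // DLV order: 38 delivered vs 40 confirmed -> -2
        System.out.println(expectedYieldVariance(0, 0));   // REL order: nothing confirmed or delivered -> 0
    }
}
```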
Edited by: Sundaresan . E. V on Nov 1, 2010 7:42 PM

Similar Messages

  • Topic about wait(),join(),sleep() and yield()

    Hi there,
I'm struggling with the implementation of wait(), join(), sleep() and yield(). Can anybody tell me:
1. What's the difference between wait() and join()? I can't see it.
2. As I know, if thread A waits on thread B then A will release the lock. How about join, or sleep, or yield?
3. Why use notify()? As I see it, if thread A waits on thread B and B completes without calling notify(), A will continue its job!!
    thanks thanks thanks

1. Object.wait() will make a thread wait until Object.notify() is called on the object. Thread.join() will make the running thread wait until the thread object finishes executing.
2. .wait() does not wait on a thread, it waits on an object. .sleep(long timeout) is about the same as .wait(long timeout), except that it needs to be awakened with .interrupt(), not .notify(), and sleep() does not release any lock the thread holds, while wait() releases the object's monitor. Also, you cannot awaken more than one thread with a single .interrupt(), while a call to .notifyAll() will awaken all threads waiting on that object. .yield() suspends the current thread only to give other threads a chance to execute, and the initial thread will continue to execute only a short time after the .yield().
    3. See #1, #2 and think a bit.
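To tie #1 and #2 together, a minimal runnable sketch (hypothetical class and method names) showing wait() releasing the lock so the worker can enter the synchronized block, and join() blocking until the worker finishes:

```java
public class WaitJoinDemo {
    // Starts a worker, waits on a shared lock until it signals, then joins it.
    static String run() throws InterruptedException {
        final Object lock = new Object();
        final StringBuilder log = new StringBuilder();

        Thread worker = new Thread(() -> {
            synchronized (lock) {
                log.append("worker done;");
                lock.notifyAll(); // wake threads blocked in lock.wait()
            }
        });

        synchronized (lock) {
            worker.start();
            lock.wait();  // releases the lock and blocks until notifyAll()
        }
        worker.join();    // blocks until the worker's run() has returned
        log.append("main resumed");
        return log.toString();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(run()); // prints "worker done;main resumed"
    }
}
```

Note there is no race here: main holds the lock while starting the worker, so the worker cannot notify before main is actually waiting.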
    Regards

  • I-pod Read do not Disconnet and Yield sign

I found an iPod Nano 2 and hooked it up to my computer, and all it says is "do not disconnect". What do I need to do to use this iPod, or can I use it at all? Please help me; I want to put my own music on the iPod. Can anybody help me? Thank you. iPod family

    Welcome to the Apple Community.
Which method did you use for moving your library?
    Moving your iTunes library to a new computer

  • ASCP Yield and Resource Lead Times

    Hi,
I am working on Advanced Supply Chain Planning on R12. The ERP source and planning are on different instances. We are also using Oracle Shop Floor Management. In the network routings we have included lead time resources at steps 10, 20 and 30. The resources are as follows:
    These resources are defined as 'Machine'
    Resource for operation 10: ASSY-LT, 112 hours, Schedule 'yes'
    Resource for operation 20: TEST-LT, 64 hours, Schedule 'yes'
    Resource for operation 30: PACK-LT, 48 hours, Schedule 'yes'
We also have only one shift of 8 hours; the resources are not available 24 hours a day.
    We also have yield at each of the above operations.
    Lead time rollup calculates 28 days for the item correctly.
We are encountering a problem with the ASCP plan. The start dates and yield calculations using a constrained plan with the Enforce Demand Due Dates constraint seem problematic. If we check the 'Calculate Resource Requirements' option, the plan calculates yield but the lead time gets reduced by one third, and the start and completion dates for a planned order (and ATP etc.) are not accurate.
If we uncheck the 'Calculate Resource Requirements' option in the plan, lead times are accurate but the plan does not calculate yield.
    We are not sure what is causing the problem.
    Does anybody know what options or profiles control this behaviour?
    Your help is highly appreciated.
    Thanks.
    Vijay

    Hi Vijay,
Important for your lead time calculation is the setup of the calendar and shifts, and which shifts are assigned to the resource.
You say you have not enabled the 'Available 24h' option. We are also using lead time resources in certain cases, for example LT for Assembly, but in these cases we enable the 24h option. Next to this, we set the resource capacity to 'unlimited' by setting units to a very high number (1000 etc.) at resource level, to prevent the system from getting into capacity conflicts here.
And the yield calculation indeed should be quite independent of the lead time resource.
    best regards,
    David.

  • Thanks to all, but giving up on Mac and going back to windows. Had enough!!

    Thanks to all who have tried to assist with my issues with Mac movie and burning.
In short, I have made a 105-minute movie which includes photos with transitions, video, some titles and a backing music track. When I "share" the movie to QuickTime, it plays just fine. When I create a DVD in iDVD and then burn to a blank DVD, it does not play correctly on DVD players.
    Specifically, the sound track is erratic in volume, cuts out in parts, or the DVD crackles up and is jolty.
    ALTHOUGH it does play just fine in the Mac DVD player (not much good to those smart people that don't have a Mac).
    Based on a lot of feedback on this forum I have tried:
    - creating disk image (works fine) but has same issue during play back on standard DVD
    - using verbatim quality disks
    - using professional quality in project properties
    - Simplifying the menu option
    - Ensuring I avoid drop zones when putting movie into IDVD
    - Ensuring I have minimum 25GB of free hard disk space (actually have over 140GB)
    - and most recently, changing the BURN SPEED to 1X.
Changing the burn speed to 1X did actually mean that the DVD played back fine on one of our DVD players, although subsequent copies have cracked up a bit towards the end of the DVD.
    When we try it in different DVD players, we tend to get different issues!!!!
    Given that we are wanting to distribute the DVD to other families, we do not know what they are going to get. ie. it plays back differently on each DVD player we try.
I never used to have the same issues with Windows, plus I find iMovie very limited in what you can do.
Back to Windows for the next DVD; I'll write this one off as a bad experience and a waste of money buying the iMac. Will NOT be recommending Mac at all.

    How new is your iMac? I wonder if you have a bad burner. What you describe in terms of different players playing discs differently is really odd. I have authored over 40 DVDs of family stuff, DVDs for the soccer team I coach, etc. all on the Mac and without any problems with any player I've tried, and family members have used them as well without problems.
    If you just bought the iMac you might consider seeing if they will replace the superdrive, though it sounds like you have made the decision to go back to Windows. From everything I hear (and no, not just from Mac people), Windows software is very challenging to use and yields very limited results - but good luck to you if you go that route.

  • Lenovo 3000 N200 ( 0769- A63) 0769 BAG processor upgrade limit and type

I have the following system properties:
    TYPE
    0769-A63
    07/12
    Name
    Lenovo 3000 N200 Notebook (0769-A63)
    Specs
    Based on 0769-BAG: T2330(1.6GHz), 2GB RAM, 120GB 5400rpm HD, 15.4in 1280x800 LCD, Intel X3100, CDRW/DVDRW, Intel
    802.11abg wireless, Bluetooth, Modem, 10/100 Ethernet, Secure chip, Fingerprint reader, Camera, 6c Li-Ion, DOS
    License
    Chipset: Intel 965 Express Chipset probably
I want to upgrade my processor to a Core 2 Duo etc.
What types of processor can be used for the upgrade, and what is the upper limit on the speed of a processor that can be fitted?
    thanks
    Moderator Note; s/n edited for member's own protection

    Wow, thank you, what an awesome little tool!
    So, I found that these 2 RAM products are compatible with my laptop:
    Part Number: CT758799
    Module Size: 2GB
    Package: 200-pin SODIMM
    Feature: DDR2 PC2-6400
    Specs: DDR2 PC2-6400 • CL=6 • Unbuffered • NON-ECC • DDR2-800 • 1.8V • 256Meg x 64 •
    Part Number: CT699699
    Module Size: 2GB
    Package: 200-pin SODIMM
    Feature: DDR2 PC2-5300
    Specs: DDR2 PC2-5300 • CL=5 • Unbuffered • NON-ECC • DDR2-667 • 1.8V • 256Meg x 64 •
I then went to a well-known auction site and found RAM cards from the same manufacturer with exactly the same specs as above but with different part numbers. Does it have to be the exact part number, or do I stand a good chance of it working if it has the required spec? I searched for the exact part numbers and got 0 results.
    Thanks for all your help on this.

  • Report for deviation of yield

    Hello Gurus,
Is there a standard report available in ECC 6.0 to check all the process orders that have deviated from their planned yields?
    Thanking you in anticipation
    Sri.

    Dear,
Try transaction code MCPM; also use MB51 with the relevant movement types for scrap and yield, and you will get the data.
    Please refer this thread,
    Re: Yield Analysis Report
    Regards,
    R.Brahmankar

  • How can I architect my data layer to yield query result pages to the application as SQL Server prepares them?

    I tried to make the question as explicit as possible.
Refer to SQL Server Management Studio's Results view. Depending upon the structure of the execution plan, the Results pane may begin displaying results while the query is still executing. Can someone point me in a direction for architecting a data layer (I am tech- and framework-agnostic for this task; any solution will suffice) that will begin receiving pages of the result set before SQL Server has completed the entire query?
The call from the data layer to SQL Server will obviously have to be asynchronous, but is there any additional ceremony that I need to be aware of when issuing OPTION (FAST x) to the query optimizer?

    Thanks for the reply. (I actually meant to put this in the SQL Data Access forum, not the T-SQL forum)
"Generally the last step is ORDER BY in a query, so nothing can start before that executes."
I would imagine you cannot ORDER BY and yield results as they are fetched, because of the execution plan that would be generated. For the purposes of this post, please assume that sorting will be done purely client side.
    "Can you post your query?"
     For purposes of discussion, let's assume that the query is
    select *
    from information_schema.columns
    and also assume that you have "lots" of columns to display.
This was an exploratory question to see what would be necessary to replicate the behavior of Management Studio's query results view in a custom application.
I would imagine that there's going to be a lot of analysis of the execution plans that get generated in order for the OPTION (FAST x) optimizer hint to do any good, but apart from general tuning concerns that would allow SQL Server to yield a page of data "fast", I was wondering if there was anything else required of the calling client to force it to yield its first page.
After thinking about this (and phrasing it the way I did in the last sentence), perhaps this is the incorrect forum for this question. I imagine that my concerns are better addressed in forums dedicated to the technology of the calling client (which would be a .NET assembly).
Be that as it may, if there is any ceremony that SQL Server imposes on clients in order to yield results early, I would expect my question to be in the scope of SQL Server discussions (even though I intended this to be in a different SQL Server forum).
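As a sketch of the architecture being discussed (not SQL Server-specific; the class name and page contents are made up), the data layer can hand pages to the application through a queue as the asynchronous reader produces them, so rendering starts before the whole result set is materialized:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class StreamingPages {
    // Sentinel object marking the end of the stream.
    static final List<String> END = new ArrayList<>();

    // The producer stands in for the asynchronous data-layer call that reads rows
    // as the server returns them (helped along by OPTION (FAST n) on the server side);
    // here it simply emits two fixed pages.
    static List<List<String>> consumeAll() throws InterruptedException {
        BlockingQueue<List<String>> queue = new LinkedBlockingQueue<>();
        Thread producer = new Thread(() -> {
            try {
                queue.put(List.of("row1", "row2")); // first page, available early
                queue.put(List.of("row3", "row4")); // later page
                queue.put(END);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        producer.start();

        // Consumer: the application handles each page as soon as it arrives,
        // without waiting for the query to finish.
        List<List<String>> pages = new ArrayList<>();
        for (List<String> page = queue.take(); page != END; page = queue.take()) {
            pages.add(page);
        }
        producer.join();
        return pages;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(consumeAll());
    }
}
```

The same shape works with any driver that exposes rows incrementally; the queue decouples how fast the server produces pages from how fast the UI consumes them.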

  • Unable to Install Windows Server 2012 R2; system aborts and reboots after first blue Windows logo appears

    System:  Dell PowerEdge sc1420 with dual xeon and Adaptec 2420SA SATA RAID (supported by OS); 10 GB memory (2x4, rank 2, organization x4, in DIMM1 & DIMM2; 2x1, registered, organization x8, in DIMM3 & DIMM4)
    This computer is currently running Windows Server 2008 R2 Enterprise which was installed with no problems.
    Objective: clean install of Windows Server 2012 R2 Standard on an otherwise unused RAID array disk set.
All attempts to boot from DVD result in loading of files (grey progress bar at the bottom of the screen), a brief dark screen, then a black screen with a blue Windows logo for about 2 seconds, then a flash of about 10 lines of error notifications on a black screen for about 1/4 second (unreadable), and then a reboot of the computer.
    Coreinfo.exe confirms that cpus DO support NX and PAE; do NOT support VMX (virtual machine enhancements).
    Attempts to run memtest.exe from boot disk tool-menu startup result in an error message that the memtest.exe file is corrupt.
    Running memtest.exe from Windows 2008 R2 install disk results in all memory tests passed!
    Attempted booting with multiple DVD's (some +R, some -R, all verified) burned from .iso.  These DVD's were used successfully to install Windows Server 2012 R2 on an HP Pavilion.  Also changed DVD drives just to rule out DVD hardware. 
    Running CHKDSK from Windows 2012 R2 on HP Pavilion shows no problems; running CHKDSK from Windows 2008 R2 on the Dell system on the same disks shows some problems.  Running checkdisk on the Windows 2008 R2 install disk from the Windows 2008 R2 installed
    system shows no problems.  All disks burned and verified on same system.
    Also attempted to boot from USB thumbdrive with copy of DVD on it.  Same results: system loads files then reboots at first Windows logo.  So that would rule out disk quality issues per se??
    Started to try an upgrade rather clean install to see if any errors were announced.  None were announced as the system went through much of the process.  I aborted somewhere along the way before committing to the upgrade because 1) the most likely
    outcome was it would not boot after install (I don't really need the practice in restoring); and 2) even if it did reboot, having a system that can't be repaired by booting the install disk is pointless.
    All disks and raid arrays are recognized and usable by RAID controller card and by Windows Server 2008 R2.  Disks are within spec for use with Adaptec 2420SA (1TB @ 300).
Is VMX (aka VT-x) actually required for ANY install even though Hyper-V is not intended to be used? If so, it is too bad that the Microsoft system requirements ( http://technet.microsoft.com/en-us/library/dn303418.aspx ) don't make that clear. And lack of VMX support really doesn't explain the memtest.exe "corruption" issue.
    Or is it maybe something unique about the files on 2012 R2 disk (or disk image on USB) that are causing some problem with the chipset processing?  But why?
    My vote would be for the latter of the issues, but I have no clue why or if it is remediable.  Suggestions? 

    So here is what I think is the final understanding of this problem:
    First, one additional piece of important information:  The computer successfully moves into and through the Windows Boot Loader phase and succeeding phase ONLY for Windows 8.1 32-bit, but NOT for Windows 8.1 64-bit nor for Windows
    Server 2012 R2 (only 64-bit).
    The Windows Boot Manager phase (which precedes the Windows Boot Loader phase) is either 32-bit or 64-bit (64 –bit for this case) and is loaded by the 16-bit stub program (Bootmgr) which starts in real mode. 
    Windows Boot Loader of course runs in either 32-bit or 64-bit (64 –bit for this case) according to the product being installed.
    By inference, the essential problem is occurring in conjunction with the loading of a WIM file to start the Windows Boot Manager. 
    The boot manager starts ok and generally shows its essential screens (Windows Boot Manager; Advanced Boot Options) or proceeds to load files for transition into Windows startup in the succeeding Windows Boot Loader phase (that doesn’t “boot” but rather
    loads the system).  This is best confirmed by the announced corruption of the memtest.exe file when selecting Windows Memory Diagnostic from the Windows Boot Manager Screen. 
If allowed to continue loading files to pass control to the Windows Boot Loader, the type of abort and resulting immediate restart that occurs is that which you would associate with unhandled CPU exceptions (invalid instruction, memory out of range, wild interrupt, etc.) that most of us have not commonly seen for 20 years (since beginning to use well-behaved OSes from Windows NT 3.1 on).
    So, the problem obviously is due to a failure to properly mount the WIM file and/or properly access it (probably the latter) when running in 64-bit protected mode. 
    This is independent of the hardware on which the install disk is actually mounted (DVD or USB-flash), so it is NOT a hardware problem
    per se.
    Since the WIM is mounted early in the process, Boot Manager may well have loaded it using BIOS routines to access the physical device on which it resides rather than loading 32-bit or 64-bit drivers of its own. 
    Hence, if the BIOS does not “mount” the WIM in a way that is later fully compatible with access from the 64-bit systems being loaded, it could cause apparent file corruption. 
    If it is only partially incompatible some functionality may appear quite usable (like loading files) until later detected. 
    This is vague on my part, because the exact nature of the incompatibility cannot be readily determined from the information available. 
    But the consequence is the same: the system cannot boot from the install disk, either for initial installation or repairs.
    Incidental conclusion:  The 64-bit boot manager code for Windows Server 2008 does not exhibit this faulty behavior, but the 64-bit code for Windows Server 2012 does. 
    So the problem is not inherent with loading 64-bit server OS’s.
    Expectations:  If the system were to be upgraded using the features of the install disk while running an installed OS, say Windows Server 2008 R2, it might well succeed and yield a fully functional system upgraded to Windows Server 2012
    R2 (because the installed operating system is started from discrete files, not from a WIM file). 
    However, in order to perform a disaster recovery using Windows resources (such as image backups from Windows Server Backup), one would have to presumably boot from a Windows Server 2012 64-bit install disk or Windows Server USB recovery drive. 
    Of course we know that the Windows Server install disk will not boot and we can be fairly sure that the problem will migrate to any Windows Server USB recovery drive that is created. 
My inspection of a Windows 7 System Repair CD shows that it is based exclusively on boot.wim, not discrete files!
Of course a generated image for a WIM for a Windows 2012 USB recovery drive might not have the flaw that drives this behavior, but it probably would.
Problem conclusion: A BIOS upgrade would be required for compatibility with a Windows Server 2012 R2 clean installation and any maintenance; and none is available from the OEM.
    Problem not resolved, but understood for future reference.
    Thanks to Tim whose comments helped me focus my thinking.

  • Ideas to improve the Rescue and Recovery System

    Hi,
some weeks ago I needed to restore my system, and during this process there were some difficulties with Rescue and Recovery. But first, my configuration:
    I'm using a T61p with Windows Vista Ultimate installed.
    On the System there are 5 Partitions
    1. Rescue and Recovery
    2. Windows Bootstrap
    3. BitLocker Encrypted Windows Partition
    4. Linux /boot
    5. Encrypted Linux LVM
So, one morning I tried to start my ThinkPad and nothing worked anymore. After a short time I found out that the boot sector of the Windows bootstrap partition was corrupted. The Linux partition worked fine and was startable with a GRUB CD-ROM.
I checked out the Rescue and Recovery system and was sad that it isn't possible to recover the boot sector of the bootstrap partition. The only way to rescue my system was to reinstall Windows entirely!
That's a bad solution, because you can recover the boot sector with every OEM DVD of Windows Vista, but this DVD is not available to a ThinkPad user.
So I backed up my files and reinstalled the system. During the reinstallation process, Rescue and Recovery gives the option to install Windows on C:. But in such a multiboot configuration it is not very clear which drive C: is. After some experimenting, I think it takes the first partition after Rescue and Recovery, or the first free space after it. At this point it would be much better if I could choose a partition for the installation.
    So, if a Lenovo engineer is reading this, please think about to:
- Include some tools like boot sector recovery in the Rescue and Recovery system
- Add a mode where you can choose the partition on which to install Windows.
    Over and above that I'm really happy with my ThinkPad.
    Thanks for reading.

    Greetings,
    This isn't so much a reply to Cobelius, or a solution, but a commiseration of sorts.  Or at least an "I hear ya, buddy, I wish they'd improve some things about Rescue and Recovery, too."
    I notice that no one else has replied to this thread.  I hope at least that someone will read these posts, and pass 'em on to the appropriate developers.
    Here's my story:
    I am, overall, extremely happy with my brand, spanking new T500.  So far, I'm only running a single OS, Vista Business 64.
    However, I did make the following 'tweaks' to my setup, which brings out the flaws in the Rescue and Recovery software:
    1) I used Truecrypt 6.1a to encrypt my system partition (leaving the original rescue partition intact)
(Side Note: Where my original drive C came with tons of empty space, I used DISKPART to shrink it a bit and created another primary partition as drive D. Hey, why the heck not? Originally, I wanted TrueCrypt to encrypt the entire drive, but this failed. TrueCrypt said this was due to a hardware failure, but 20 hours of my life wasted scanning the drive with HDD Regenerator 1.51 revealed no bad sectors. So now drive D is encrypted as an ordinary TrueCrypt container and mounted upon login. I'm okay with this solution, but if anyone wants to research why TrueCrypt couldn't do the whole disk, you have my thanks in advance.)
    2) I bought a Maxtor BlackArmor(tm) hardware-encrypted USB external drive to use for my backups.  I successfully used the Lenovo Rescue and Recovery tool to both create specific file backups as well as to image my entire drive C onto the BlackArmor device, once I mounted it (by running a built-in executable and supplying it with my secret password)
    3) I set a Lenovo BIOS password for unlocking the boot hard drive.
So far, so good. I type the BIOS drive password, then the TrueCrypt password, and Vista takes over 12 minutes to become usable and, well, that's a Microsoft issue, isn't it?
    But, here's the deal: WHAT IF I CAN'T BOOT FROM THE HARD DRIVE ANYMORE?  Well, TrueCrypt made me create a bootable rescue CD which can restore my encrypted boot sector or permanently decrypt my hard drive.  What it can also do is let me press the ESC key to boot without providing a password, which does a supposedly wonderful thing -- since it fails to boot my drive C, it takes me straight into the Lenovo Rescue And Recovery session.
    The problem is, the Lenovo boot (version 4) takes me AUTOMATICALLY into a self-repair utility which wastes another 10 minutes of my time to finally tell me --d'uh-- that my boot sector is 'corrupted'  (it isn't; it's merely ENCRYPTED by TrueCrypt.  As it should be).
    FLAW NUMBER 1:  Rescue and Recovery should provide a default menu where automatic diagnosis is a CHOICE the user can elect not to make, or abort if it has begun.
FLAW NUMBER 2: You guys should enhance what appears to be a Vista PE environment (or BartPE, or whatever) with a few utilities, including an Explorer-like browser to examine the disk (which would show nothing in my case, as my drives are encrypted), a command prompt, and the ability to launch executables. The environment should also recognize USB drives.
    As it happens,  I used a third-party Vista PE rescue CD, Active@ Partition Recovery, to boot an environment having the utilities I needed.  Using this tool, I was able to recognize a USB key that had TrueCrypt installed in 'traveller' mode (no windows registry keys needed).  Running this app and providing my truecrypt password enabled me to unlock my Drive C, albeit by mounting it as another drive letter.  This made it possible to read and write to my disk.
     The encrypted Maxtor drive is visible, too, from this boot environment,  but it appears as a CD-ROM drive, and yields no secrets until you run an executable file on it and enter the drive's password. And that worked, too.  So I could find files to copy over to my hard drive, if I needed to.
    However, in order to RESTORE files to my drive, I'd still need to be able to run Rescue and Recovery, or at least the Recovery portion.  Which  brings me to:
    FUNDAMENTAL FLAW NUMBER 3:  You need to create a Recovery tool that can run from within an external USB drive, without requiring shared .DLLs in various subfolders or windows registry entries.  Lord knows, I certainly tried to copy  RestoreNow.exe from C:\Program Files (x86)\Lenovo\Rescue and Recovery onto my Maxtor drive, along with whatever DLLs it called for, but it was hopeless.  The damned thing just wouldn't run from the USB drive on the PE environment.
    The way I see it, then, is that there really is no way to restore my hard drive should I suffer a catastrophic failure.  Major bummer.
    FUNDAMENTAL FLAW NUMBER 4: When I got my laptop, I created a series of recovery disks using the Lenovo supplied software. Booting with the first CD, however, only proved depressing, as I had to agree that the ONLY purpose for this CD was to recover my PC to factory-shipped condition.  Which meant erasing my drive and restoring it with the contents of the other DVDs.  But I want to restore what I backed up with Rescue and Recovery!  My life SINCE the laptop shipped from the factory!   Geez, Louise, why not let me choose to do THAT, too?
    (Of course, the boot CD would need to allow me to execute the Maxtor program that unlocks my encrypted USB drive, and also let me execute TrueCrypt to unlock my hard drive.)
    I sure hope someone forwards this to a developer who'll take it seriously, after s/he stops chuckling.  You guys should just ship a usable Vista PE rescue CD with every laptop, I think.  And a recovery-only tool that doesn't need DLLs or Registry Entries to run. It would help in so many ways.
    In the meantime, I guess I'd better do all I can to ensure my laptop's hard drive doesn't die on me.
    Again, other than for this teeny problem that "don't amount to a hill o' beans in this crazy world," I'm really happy with my T500.
    Thanks for reading this, too.  And happy holidays.

  • Video card news and CS5 / MPE

    With the upcoming launch of CS5 and the Mercury Playback Engine, this may be interesting.
    Side note: We do not yet know whether FERMI cards will be supported.
The GTX 480 simply has the fastest GPU, with an average speed gain of about 15 percent over the HD 5870. The ATI costs around $400, while about $500 is to be paid for Nvidia's latest. So far there is nothing remarkable, because in hardware land there is a premium for the fastest of the fastest.
The HD 5970 throws a spanner in the works. This dual-GPU card actually leaves the GTX 480 in the dust; the performance of ATI's top model is on average around 35 percent better. The HD 5970 is again one hundred US dollars more expensive than the GTX 480, but both the HD 5870 and HD 5970 offer more performance per dollar.
There are obviously more factors than price-performance ratio. The GTX 480, however, loses when it comes to energy usage, even when the benefits are taken into consideration. Indeed, its energy consumption is only marginally lower than that of the faster HD 5970. The associated noise is also a drawback; here the HD 5870 and HD 5970 cards are the clear winners.
Then there's more to be considered. AMD now has the Eyefinity technique, with which three displays can be controlled. Nvidia counters with 3D Surround Vision: this can also drive three screens, but two graphics cards in SLI are needed. With three 120Hz displays it is even possible to show 3D images on three displays.
Prospects appear bleak for the GTX 480, but the card cannot be discarded - at least not until the GTX 480 is in the shops and the actual price is clear. If retail prices are slightly below the recommended price, while HD 5870 prices are kept artificially high because of high demand, Nvidia's latest offering may be a good option. Nvidia has already stated that the cards will be fully stocked from April 12. In the more distant future, maybe Nvidia can improve drivers and yields, which could push the price down further and further improve performance. How much performance gain the card can achieve is the question. Whoever wants the fastest of the fastest would do better to invest in the HD 5970.
    Now the question arises:
    Assuming MPE will support the FERMI cards, which one is better:
1. The ATI HD 5970 is the fastest, but does not benefit from CUDA; supports DX11
2. The GTX 285 is much slower, but benefits from CUDA; supports DX10
3. The GTX 480 is slower than the ATI, but may benefit from MPE; supports DX11
These are interesting times for all who want to change their graphics cards. We are waiting for some benchmark tests to shed light on what is ahead for us.
    And let's hope Adobe extends the MPE option to include ATI cards...

The specs on the HP systems are pretty weak.
A Xeon 3503? No such thing; do you mean 3530?
A Xeon 3530 is the exact same thing as a Core i7 930.
ECC RAM will slow the system down and is not needed.
8 GB of RAM is wrong for a 1366 socket; it needs to be triple channel, so 6 or 12 GB.
The bare minimum is 3 hard drives.
As mentioned, that video card is a joke. HP has a bad habit of selling low-budget cards or very expensive cards.
Don't buy into the "Quadro" name and think all is well.
    better to buy from a custom NLE builder
    Scott
    ADK

  • I got a email from "find my iphone" saying my iphone was erased...and it was! how did that happen?

I got an email from "Find My iPhone" saying my phone was erased... and it was! Why did this happen?

One - you may have told it to disable/erase the iPhone while communicating with the iCloud Find My iPhone site.
Two - the finder/thief may have reset the iPhone to factory settings, simple enough to do, which rendered the iPhone untrackable.

  • Help - option premium calculator configuration and testing

    Hi Experts,
I've been reading up on the option premium calculator on help.sap.com, but it doesn't give me the prerequisites that have to be maintained before TXAK can be used properly.
I would like to request assistance in 1) customizing/configuring and 2) testing the option premium calculator.
    Customizing done so far (using DUMMY Values):
    A) Yield Curve type 9999, Curr SGD maintenance in table JBD14
    B) Reference Interest rate for yield curve 9999 via txn JBYC
    customizing via SPRO (using DUMMY values):
    C) Define reference interest rate (NOMVAL_SGD)
    FSCM > TRM > Basic Functions > Market data mgmt > Master data > Settings for ref. interest rates and yield curves for analyzers
    D) Enter exchange rate volatilities for transaction currencies
    FSCM > TRM > Basic Functions > Market data mgmt > Manual market data entry > Statistical data > Enter exchange rate volatilities
    Please do let me know what other data I SHOULD MAINTAIN aside from the items above..
    Thanks!
    David

    Hi David,
    strange, it says you posted it last year, but I haven't seen your thread before.
    For testing, see: http://help.sap.com/erp2005_ehp_04/helpdata/EN/d2/6f7cb2415e11d182b10000e829fbfe/frameset.htm
    But I suppose you already got some answers in your other post:
    Re: Option premium calculation for OTC Currency options
    BR, Tomislav
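    For context on what TXAK is pricing: OTC currency option premiums are conventionally computed with the Garman-Kohlhagen variant of Black-Scholes (spot discounted at the foreign rate, strike at the domestic rate). A minimal, self-contained sketch - all market data below are dummy values like those in the customizing above, and this is illustrative Python, not SAP code:

    ```python
    from math import erf, exp, log, sqrt

    def norm_cdf(x: float) -> float:
        """Standard normal CDF via the error function."""
        return 0.5 * (1.0 + erf(x / sqrt(2.0)))

    def gk_call_premium(spot, strike, t, r_dom, r_for, vol):
        """Garman-Kohlhagen premium of a European FX call.

        spot/strike quoted as domestic per unit of foreign currency,
        t in years, r_dom/r_for continuously compounded, vol annualized.
        """
        d1 = (log(spot / strike) + (r_dom - r_for + 0.5 * vol ** 2) * t) / (vol * sqrt(t))
        d2 = d1 - vol * sqrt(t)
        return spot * exp(-r_for * t) * norm_cdf(d1) - strike * exp(-r_dom * t) * norm_cdf(d2)

    # Dummy SGD example: every input here is made up for illustration
    premium = gk_call_premium(spot=1.35, strike=1.35, t=0.5,
                              r_dom=0.02, r_for=0.01, vol=0.10)
    print(round(premium, 4))
    ```

    The reference interest rates and exchange rate volatilities you maintain in customizing correspond to the r_dom/r_for and vol inputs here, which is why the calculator gives nothing useful until they exist.
    
    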

  • Lenovo ix4-300d and 4 Hd 6 tb red

    Hi, first of all sorry for my English.
    I have four ix4-300d NAS units.
    Three have four green 3 TB drives each and have worked perfectly for three years without any problem. I run them in RAID 0 because
    I duplicate the files onto other drives and I want the maximum space available.
    The fourth one I bought three days ago, and I tried to install four 6 TB WD Red drives in RAID 0. Here is the problem:
    I can't create shares because of this message:
    "The selected function is not available due to the state of the pools"
    So I looked on the internet and saw that maybe I can't use RAID 0 if a disk is bigger than 4 TB (crap).
    Then I tried RAID 5 (the only option I can choose besides RAID 10), but after 36 hours the data reconstruction
    is only at 19%, with 0 files on it.
    Can someone help me, please?
    Can someone give me some advice?
    Do you know of a NAS that works in RAID 0 with 6 TB and 8 TB disks?
    Thanks in advance.

    Hello Nico84
    1.  The ix4-300d can probably use 6 TB HDDs if only 2-3 are installed, depending on the protection mode (2x 6 TB disks in RAID 0, 3x 6 TB disks in RAID 5). However, the limitation you are running into is that the ix4-300d's Marvell Armada XP 1.3 GHz MV78230 processor is only 32-bit and has a 16 TB ceiling.
    2.  The ix2-dl and px-class devices can officially support 6 TB HDDs in RAID 0 or NONE.
    3.  Please check with these retailers/resellers about LenovoEMC devices in your region: Where to Buy
    4.  With any 32-bit system the capacity limit for a volume is 16 TB. On the ix4-300d the maximum supported configuration is 4x 5 TB disks in RAID 5, which would roughly yield a 15 TB volume.
    What you are attempting with 4x 6 TB HDDs in RAID 5 would roughly yield an 18 TB volume, which is above the 16 TB volume ceiling for the unit's processor.
    Again, using 3x 6 TB HDDs should theoretically work in RAID 5 and yield about a 12 TB volume. If you were to use just 2x 6 TB HDDs, you should theoretically be able to have a 12 TB RAID 0 array. Both of these would be under the 16 TB cap.
    If you fully intend to use 4x 6 TB or larger disks, then a px-class unit will need to be used.
    I hope this better explains the situation you are having.
    LenovoEMC Contact Information is region specific. Please select the correct link then access the Contact Us at the top right:
    US and Canada: https://lenovo-na-en.custhelp.com/
    Latin America and Mexico: https://lenovo-la-es.custhelp.com/
    EU: https://lenovo-eu-en.custhelp.com/
    India/Asia Pacific: https://lenovo-ap-en.custhelp.com/
    http://support.lenovoemc.com/
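    The volume-size arithmetic in the reply above can be sketched as follows (these are idealized raw capacities; real usable space is a bit less after filesystem overhead):

    ```python
    CEILING_TB = 16  # 32-bit volume-size limit on the ix4-300d

    def usable_tb(n_disks: int, disk_tb: int, raid: int) -> int:
        """Idealized usable capacity in TB for a simple RAID array."""
        if raid == 0:             # striping: all capacity usable
            return n_disks * disk_tb
        if raid == 5:             # one disk's worth of capacity goes to parity
            return (n_disks - 1) * disk_tb
        if raid == 10:            # mirrored stripes: half the capacity
            return (n_disks // 2) * disk_tb
        raise ValueError("unsupported RAID level")

    for n, size, raid in [(4, 6, 5), (3, 6, 5), (2, 6, 0), (4, 5, 5)]:
        vol = usable_tb(n, size, raid)
        verdict = "ok" if vol <= CEILING_TB else "over the 16 TB ceiling"
        print(f"{n}x {size}TB RAID {raid}: {vol} TB volume -> {verdict}")
    ```

    Running this shows why 4x 6 TB in RAID 5 (18 TB) is rejected while 3x 6 TB in RAID 5 or 2x 6 TB in RAID 0 (12 TB each) stay under the cap.
    
    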

  • BUG: Play in order (songs and podcasts)

    I manually load my shuffle with podcasts and music. I set the play order while the shuffle is still connected to my Powerbook: I want my songs to be first, then the podcasts. Everything seems okay when I eject the shuffle.
    I reset the playlist (three quick clicks) expecting the first song in my playlist to begin playing. Instead the first podcast actually plays! Podcasts always play first in spite of my manual re-order of the playlist.
    Also, while I can re-order songs in my playlist, I cannot re-order podcasts. In other words, songs, which I can manually re-order, do not play until after the last podcast plays. I cannot change the play order of the podcasts!
    The reason I call this a bug is that iTunes allows me to manually drag selections around to re-order a playlist, but it does not actually transfer my changes to the shuffle.
    Am I missing something or, is this the way it's supposed to work?

    Podcasts play in something like the order they were downloaded. None of the sort columns will replicate that order, but it's in there, and there's no way to put podcasts into an iPod shuffle and have them play in any other order. Most frustrating.
    What's needed is a quick and simple way to prevent iTunes and/or the iPod from recognizing podcasts, and treat them instead like any other mp3. I tried some simple things like changing the genre label from "Podcast" to something else, and moving the file out of the "Podcasts" folder in iTunes Music, but neither did the trick. I unchecked the two options "skip when shuffling" and "remember place" in the podcast's Get Info Options tab, but that didn't work. The file is identified as a regular mp3 file with the standard .mp3 filename extension, so it's not a case of needing to convert it from some exotic .mp? variant.
    So it seems that whatever tags the file as a "podcast" is not accessible to the user in iTunes at least, unless someone can please prove me wrong. I haven't tried altering the Type/Creator codes for the podcast files in Finder -- anybody else tried that?
    One thing finally worked, but it's somewhat cumbersome. I used the excellent shareware application MP3Trimmer to "repair" the podcast mp3 file. What this does, whether the file is actually "broken" or not, is strip out all non-audio information, yielding a pure, clean, raw mp3 file with all other tags and info removed. I put this file into iTunes, and voila! It's no longer recognized as a podcast file, and can be sorted into playlists on the iPod just like any other normal mp3 file.
    Unfortunately, MP3Trimmer does not yet have a batch mode, which would be a huge help since the "cleaning" takes a while. But the MP3Trimmer developer, very friendly and helpful fellow, says that batch capability is at the top of his priority list and the next major update will have it. He updates the program pretty regularly, so I would guess that this should happen within the next few months -- maybe tomorrow for all I know!
    If anyone knows of any other mp3-editing software that can similarly strip an mp3 file down to pure audio, I'm sure it would serve just as well, and if it already has batch capability -- well, please let me know!
    Note that the MP3Trimmer solution has the advantage of not degrading the audio in any way. One could for example convert the podcast mp3 to a WAVE file and then back to an mp3 and most likely this would also succeed in removing any podcast identification from the file, but you'd suffer another audio fidelity loss in that additional conversion to mp3, and that's no good, is it?
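    The "repair" trick described above amounts to stripping the file's metadata; the assumption is that whatever marks the file as a podcast lives in the ID3v2 tag at the start of the mp3. A minimal sketch of removing a leading ID3v2 block from raw file bytes (an illustration, not a full ID3 parser - it ignores tag footers and any trailing ID3v1 tag):

    ```python
    def strip_id3v2(data: bytes) -> bytes:
        """Return mp3 bytes with a leading ID3v2 tag removed, if present."""
        if data[:3] != b"ID3" or len(data) < 10:
            return data
        # Tag size is a 4-byte "syncsafe" integer: 7 useful bits per byte.
        size = (data[6] << 21) | (data[7] << 14) | (data[8] << 7) | data[9]
        return data[10 + size:]  # skip the 10-byte header plus the tag body

    # Synthetic example: a fake 12-byte ID3v2.3 tag followed by audio bytes
    fake = b"ID3" + b"\x03\x00" + b"\x00" + bytes([0, 0, 0, 12]) + b"X" * 12 + b"\xff\xfbAUDIO"
    print(strip_id3v2(fake))  # -> b'\xff\xfbAUDIO'
    ```

    Like the MP3Trimmer approach, this doesn't touch the audio frames at all, so there is no fidelity loss - unlike a round trip through WAVE and back to mp3.
    
    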
    iMac slot-loading G3 500 MHz   Mac OS X (10.2.x)  
