Profile Performance and Memory shows very large 'VI Time' value

When I run the Profile Performance and Memory tool on my project, I get very large numbers for VI Time (and Sub VIs Time and Total Time) for some VIs, for example 1844674407370752.5.  I have selected only 'Timing statistics' and 'Timing details'.  Sometimes the numbers start out reasonable, then jump to these large values when I update the display with the snapshot button, and stay large.  Other VI Times remain reasonable.
LabVIEW 2011 Version 11.0 (32-bit).  Windows 7.
What gives?
 - les

les,
The number indicates some kind of rollover... so, do you have a VI where this happens all the time? Can you share it with us?
thanks,
Norbert
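The reported magnitude is consistent with the rollover theory: if an unsigned 64-bit tick difference comes out negative (end before start), it wraps to nearly 2^64, and scaling that by 1/10000 lands right at the ~1.84e15 value in the question. A minimal sketch; the tick unit and the scaling factor are assumptions, since LabVIEW's internal profiler counters are not documented:

```python
# Hypothetical illustration of an unsigned 64-bit timer rollover; the
# tick values and scaling are assumptions, not LabVIEW's documented internals.
start = 5_000_000                 # ticks at VI entry
end = 4_000_000                   # ticks at VI exit (clock stepped backwards)

diff = (end - start) % 2**64      # unsigned subtraction wraps to near 2**64
print(diff)                       # 18446744073708551616

# Scaled by 1e-4 (e.g. 0.1 ms ticks shown in seconds), the wrapped value
# lands in the same ballpark as the number in the question (~1.84e15):
print(diff / 10_000)
```

Whether the profiler actually uses such a counter is speculation, but the arithmetic matches the symptom.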

Similar Messages

  • Profile Performance and memory

    Hi guys,
    I am trying to analyze my profile data from Profile Performance and Memory.
    When I check the data for VI Time, I find the same VIs listed with different VI Times. What does that mean?
    The reason I am doing this is that I am going to add a pulse model to my VI, so I need to know the timing relationship between all my instruments. Am I on the right track?
    Thanks
    Attachments:
    VI Time.png 72 KB

    It looks like the timing had a resolution of 15.6 ms, which I think is common on some flavors of the Windows OS.  Note that the average times on some of those VIs are zero.  This suggests that for many of the 1878 runs the time was probably recorded as zero and a few were at 15.6 ms.
    This is one of the limitations of the Profiler.
    Lynn
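Lynn's arithmetic can be sketched numerically: a timer that only advances every 15.6 ms reports each run as the number of tick boundaries it crosses, so sub-millisecond runs mostly record zero. This is a simulation of the quantization, not NI's implementation:

```python
import math

TICK_MS = 15.6  # typical Windows timer tick (64 Hz), as noted above

def reported_time(start_ms, dur_ms):
    """Duration as seen by a timer that only advances every TICK_MS."""
    ticks = math.floor((start_ms + dur_ms) / TICK_MS) - math.floor(start_ms / TICK_MS)
    return ticks * TICK_MS

# A sub-millisecond run usually records 0, occasionally a full tick:
print(reported_time(1.0, 0.3))    # 0.0  -- fits between two ticks
print(reported_time(15.5, 0.3))   # 15.6 -- straddles a tick boundary
```

With 1878 runs mostly recording 0 and a handful recording 15.6 ms, the average rounds down to zero, exactly as described.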

  • Acrobat XI makes my icons and the program itself very large.  It is like everything is super big, and then it does not work well for viewing files.  What can I do?

    My Acrobat XI enlarges my desktop, and the program is also very large.  I cannot see the whole screen because it is so large.  When I end Acrobat everything returns to normal.  I just want to use Acrobat XI to read pdf files.

    Hi Pat, that is the problem I am having.  I have tried to change my scale for screen settings to 100%, but every time I change it, the next time I open the program, it reverts to system settings.  It is not saving the new change I made.  I have looked all over for a save command but there isn’t any.  I even went to my regedit settings and did exactly what the instructions say.  Still the same problem.  Everything is at least 200%.  Once I exit the Reader, everything reverts to normal.  What am I doing wrong?  Or what else can I do?
    Thanks.
    Paul Bardotz

  • iMac with SSD and hard drive: very slow startup time

    I have a new mid-2011 iMac with a 250GB SSD and a 2TB drive, both from Apple as BTO options. When I first used the iMac, my typical startup time from the hardware-check chime to desktop was about 12 seconds. Something, somehow, has changed, because now it takes over one minute to boot from the SSD. After the chime I hear the fans running and also the 2TB drive.  After about 20 seconds I hear the hard drive spin down, and then after another 20 seconds it starts up again. Once it starts up, the boot sequence continues and returns to the desktop. After this the iMac performs as normal. I have tried running Disk Utility from the recovery partition. I have also reinstalled Lion from the recovery partition, and neither of these changes the outcome.
    When I first got the iMac I used Time Machine to restore from a previous machine onto the 2TB partition, and then went into the advanced options in System Preferences > Users & Groups to point my home folder to the 2TB drive rather than the default one on the SSD. That was working fine until I discovered software problems with both Preview and TextEdit, where they would not start up with this arrangement. I then repointed my home folder to the default one on the SSD, which cured the issue with these programmes while leaving my data on the 2TB drive. Since doing this I have had the one-minute boot-up problem.
    Can anyone offer any advice as to how I may overcome this problem and get my iMac back to starting up in twelve seconds from the SSD as it did before?  Any help would be appreciated on this topic.
    Thanks.

    Where do you maintain your Home folder - on the SSD or on the HDD?
    If you want to do a clean install (from scratch) then do the following:
    Install or Reinstall Lion from Scratch
    If possible backup your files to an external drive or second internal drive.
    Boot to the Recovery HD:
    Restart the computer and after the chime press and hold down the COMMAND and R keys until the menu screen appears. Alternatively, restart the computer and after the chime press and hold down the OPTION key until the boot manager screen appears. Select the Recovery HD and click on the downward pointing arrow button.
    Erase the hard drive:
    Select Disk Utility from the main menu and click on the Continue button.
    After DU loads select your hard drive (this is the entry with the mfgr.'s ID and size) from the left side list. Note the SMART status of the drive in DU's status area.  If it does not say "Verified" then the drive is failing or has failed and will need replacing.  SMART info will not be reported  on external drives. Otherwise, click on the Erase tab in the DU main window.
    Set the format type to Mac OS Extended (Journaled.) Click on the Erase button and wait until the process has completed.
    Quit DU and return to the main menu.
    Reinstall Lion: Select Reinstall Lion and click on the Install button.
    Note: You can also re-download the Lion installer by opening the App Store application. Hold down the OPTION key and click on the Purchases icon in the toolbar. You should now see an active Install button to the right of your Lion purchase entry. There are situations in which this will not work. For example, if you are already booted into the Lion you originally purchased with your Apple ID or if an instance of the Lion installer is located anywhere on your computer.
    Before you go to the extreme of the above would you do this:
    Open Activity Monitor in the Utilities folder.  Select All Processes from the Processes dropdown menu.  Click twice on the CPU% column header to display in descending order.  If you find a process using a large amount of CPU time, then select the process and click on the Quit icon in the toolbar.  Click on the Force Quit button to kill the process.  See if that helps.  Be sure to note the name of the runaway process so you can track down the cause of the problem.
    Just in case your system slowness is due to something eating up CPU time. Oh, and if your system is brand new check the upper right corner of the menubar for a "dot" inside the Spotlight glass. This means Spotlight is indexing the drive. Until it finishes your system will be quite sluggish.

  • I want to load a large raw XML file in Firefox and parse it with the DOM, but with a large XML file Firefox is very slow and sometimes crashes. Is there any option to increase DOM handling memory in Firefox?

    Actually, I am using an offline form to load a very large XML file, and I am using Firefox to load that form. But it takes a long time to load, and sometimes the browser crashes. I parse this XML file into my form through the DOM. Is there any option to increase the DOM handler size in Firefox?

    Thank you for your suggestion. I have a question,
    though. If I use a relational database and try to
    access it for EACH and EVERY click the user makes,
    wouldn't that take much time to populate the page with
    data?
    Isn't XML store more efficient here? Please reply me.

    You have the choice of reading a small number of records (10 children per element?) from a database, or parsing multiple megabytes. Reading 10 records from a database should take maybe 100 milliseconds (1/10 of a second). I have written a web application that reads several hundred records and returns them with acceptable response time, and I am no expert. To parse an XML file of many megabytes... you have already tried this, so you know how long it takes, right? If you haven't tried it then you should. It's possible to waste a lot of time considering alternatives (the term is "analysis paralysis"); speculating on how fast something might be doesn't get you very far.

  • In Mail, one mailbox for Recovered Message (AOL) keeps showing 1 very large message that I cannot delete. How can I get rid of this recurring problem, please?

    In Mail on iMac, successfully running OS X Lion, one mailbox on My Mac for "Recovered Messages (from AOL)" keeps showing 1 very large message (more than 20 Mb) that I just cannot seem to delete. Each time I go into my In Box, the "loading" symbol spins and the message appears in the "Recovered Messages" mailbox. How can I get rid of this recurrent file, please?
    At the same time, I'm not receiving any new mails in my In Box, although if I look at the same account on my MacBook Pro I can indeed see the incoming mails (but on that machine I do not have the "recovery" problem).
    The help of a clear-thinking Apple fan would be greatly appreciated.
    Many thanks.
    From Ian in Paris, France

    Ian
    I worked it out.
    Unhide your hidden files ( I used a widget from http://www.apple.com/downloads/dashboard/developer/hiddenfiles.html)
    Go to your HD.
    Go to Users.
    Go to your House (home)
    there should be a hidden Library folder there (it will be transparent)
    Go to Mail in this folder
    The next folder ( for me ) is V2
    Click on that and the next one will be a whole list of your mail servers, and one folder called Mailboxes
    Click on that and there should be a folder called Recovered Messages (server).mbox
    Click on that; inside there is a random numbered/lettered folder -> data
    In that data folder is a list of random numbered folders (i.e a folder called 2, one called 9 etc) and in EACH of these, another numbered folder, and then a folder called messages.
    In the messages folder delete all of the ebmx (I think that's what they were from memory, sorry I forgot as I already deleted my trash after my golden moment).
    This was GOLDEN for me. Reason being, when I went to delete my "recovered file" in mail, it would give me an error message " cannot delete 2500 files". I knew it was only 1 file so this was weird. Why 2500 files? Because if you click on the ebmx files like I did, hey presto, it turned out that they were ALL THE SAME MESSAGE = 2500 times. In each of those folders in the random numbers, in their related message folder.
    Now remember - DON'T delete the folder; make sure you have gone to the messages folder, found all those pesky ebmx files and deleted THOSE, not the folder.
    It worked for me. No restarting or anything. And recovered file. GONE.
    Started receiving and syncing mail again. Woohoo.
    Best wishes.

  • I need to sort very large Excel files and perform other operations.  How much faster would this be on a MacPro rather than my MacBook Pro i7, 2.6, 15R?

    I am a scientist and run my own business.  Money is tight.  I have some very large Excel files (~200MB) that I need to sort and perform logic operations on.  I currently use a MacBook Pro (i7 core, 2.6GHz, 16GB 1600MHz DDR3) and I am thinking about buying a multicore Mac Pro.  Some of the operations take half an hour to perform.  How much faster should I expect these operations to be on a new Mac Pro?  Is there a significant speed advantage in the 6-core vs 4-core?  Practically speaking, what are the features I should look at, and what is the speed bump I should expect if I go to 32GB or 64GB?  Related to this, I am using a 32-bit version of Excel.  Is there a 64-bit spreadsheet that I can use on a Mac that has no limit on column and row size?

    Grant Bennet-Alder,
    It’s funny you mentioned using Activity Monitor.  I use it all the time to watch when a computation cycle is finished so I can avoid a crash.  I keep it up in the corner of my screen while I respond to email or work on a grant.  Typically the %CPU will hang at ~100% (sometimes even saying the application is not responding in red) but will almost always complete the cycle if I let it go for 30 minutes or so.  As long as I leave Excel alone while it is working it will not crash.  I had not thought of using the Activity Monitor as you suggested. Also I did not realize using a 32 bit application limited me to 4GB of memory for each application.  That is clearly a problem for this kind of work.  Is there any work around for this?   It seems like a 64-bit spreadsheet would help.  I would love to use the new 64 bit Numbers but the current version limits the number of rows and columns.  I tried it out on my MacBook Pro but my files don’t fit.
    The hatter,
    This may be the solution for me. I’m OK with assembling the unit you described (I’ve even etched my own boards) but feel very bad about needing to step away from Apple products.  When I started computing this was the sort of thing computers were designed to do.  Is there any native 64-bit spreadsheet that allows unlimited rows/columns, which will run on an Apple?  Excel is only 64-bit on their machines.
    Many thanks to both of you for your quick and on point answers!

  • Need help optimizing the writing of a very large array and streaming it to a file

    Hi,
    I have a very large array that I need to create and later write to a TDMS file. The array has 45 million entries, or 4.5x10^7 data points. These data points are of double format. The array is created by using a square pulse waveform generator and user-defined specifications of the delay, wait time, voltages, etc. 
    I'm not sure how to optimize the code so it doesn't take forever. It currently takes at least 40 minutes (and I'm still running it) to create and write this array. I know there must be a better way: the array is large and consumes a lot of memory, but it's not absurdly large. The computer I'm running this on is running Windows Vista 32-bit, and has 4GB RAM and an Intel Core 2 CPU @ 1.8GHz.
    I've read the "Managing Large Data Sets in LabVIEW" article (http://zone.ni.com/devzone/cda/tut/p/id/3625), but I'm unsure how to apply the principles here.  I believe the problem lies in making too many copies of the array, as creating and writing 1x10^6 values takes < 10 seconds, but writing 4x10^6 values, which should theoretically take < 40 seconds, takes minutes. 
    Is there a way to work with a reference of an array instead of a copy of an array?
    Attached is my current VI, Generate_Square_Pulse_With_TDMS_Stream.vi, and its two dependencies, although I doubt they are bottlenecking the program.
    Any advice will be very much appreciated. 
    Thanks
    Attachments:
    Generate_Square_Pulse_With_TDMS_Stream.vi 13 KB
    Square_Pulse.vi 13 KB
    Write_TDMS_File.vi 27 KB

    Thanks Ravens Fan, using replace array subset and initializing the array beforehand sped up the process immensely. I can now generate an array of 45,000,000 doubles in about one second.
    However, when I try to write all of that out to TDMS at the end LV runs out of memory and crashes. Is it possible to write out the data in blocks and make sure memory is freed up before writing out the next block? I can use a simple loop to write out the blocks, but I'm unsure how to verify that memory has been cleared before proceeding.  Furthermore, is there a way to ensure that memory and all resources are freed up at the end of the waveform generation VI? 
    Attached is my new VI, and a refined TDMS write VI (I just disabled the file viewer at the end). Sorry that it's a tad bit messy at the moment, but most of that mess comes from doing some arithmetic to determine which indices to replace array subsets with. I currently have the TDMS write disabled.
    Just to clarify the above, I understand how to write out the data in blocks; my question is: how do I ensure that memory is freed up between subsequent writes, and how do I ensure that memory is freed up after execution of the VI?
    @Jeff: I'm generating the waveform here, not reading it. I guess I'm not generating a "waveform" but rather a set of doubles. However, converting that into an actual waveform can come later. 
    Thanks for the replies!
    Attachments:
    Generate_Square_Pulse_With_TDMS_Stream.vi 14 KB
    Write_TDMS_File.vi 27 KB

  • ORA-00385: cannot enable Very Large Memory with new buffer cache 11.2.0.2

    [oracle@bnl11237dat01][DWH11]$ sqlplus / as sysdba
    SQL*Plus: Release 11.2.0.2.0 Production on Mon Jun 20 09:19:49 2011
    Copyright (c) 1982, 2010, Oracle. All rights reserved.
    Connected to an idle instance.
    SQL> startup mount pfile=/u01/app/oracle/product/11.2.0/dbhome_1/dbs//initDWH11.ora
    ORA-00385: cannot enable Very Large Memory with new buffer cache parameters
    DWH12.__large_pool_size=16777216
    DWH11.__large_pool_size=16777216
    DWH11.__oracle_base='/u01/app/oracle'#ORACLE_BASE set from environment
    DWH12.__pga_aggregate_target=2902458368
    DWH11.__pga_aggregate_target=2902458368
    DWH12.__sga_target=4328521728
    DWH11.__sga_target=4328521728
    DWH12.__shared_io_pool_size=0
    DWH11.__shared_io_pool_size=0
    DWH12.__shared_pool_size=956301312
    DWH11.__shared_pool_size=956301312
    DWH12.__streams_pool_size=0
    DWH11.__streams_pool_size=134217728
    #*._realfree_heap_pagesize_hint=262144
    #*._use_realfree_heap=TRUE
    *.audit_file_dest='/u01/app/oracle/admin/DWH/adump'
    *.audit_trail='db'
    *.cluster_database=true
    *.compatible='11.2.0.0.0'
    *.control_files='/dborafiles/mdm_bn/dwh/oradata01/DWH/control01.ctl','/dborafiles/mdm_bn/dwh/orareco/DWH/control02.ctl'
    *.db_block_size=8192
    *.db_domain=''
    *.db_name='DWH'
    *.db_recovery_file_dest='/dborafiles/mdm_bn/dwh/orareco'
    *.db_recovery_file_dest_size=7373586432
    *.diagnostic_dest='/u01/app/oracle'
    *.dispatchers='(PROTOCOL=TCP) (SERVICE=DWH1XDB)'
    DWH12.instance_number=2
    DWH11.instance_number=1
    DWH11.local_listener='(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(PROTOCOL=TCP)(HOST=bnl11237dat01-vip)(PORT=1521))))'
    DWH12.local_listener='(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(PROTOCOL=TCP)(HOST=bnl11237dat02-vip)(PORT=1521))))'
    *.log_archive_dest_1='LOCATION=/dborafiles/mdm_bn/dwh/oraarch'
    *.log_archive_format='DWH_%t_%s_%r.arc'
    #*.memory_max_target=7226785792
    *.memory_target=7226785792
    *.open_cursors=1000
    *.processes=500
    *.remote_listener='LISTENERS_SCAN'
    *.remote_login_passwordfile='exclusive'
    *.sessions=555
    DWH12.thread=2
    DWH11.thread=1
    DWH12.undo_tablespace='UNDOTBS2'
    DWH11.undo_tablespace='UNDOTBS1'
    SPFILE='/dborafiles/mdm_bn/dwh/oradata01/DWH/spfileDWH1.ora' # line added by Agent
    [oracle@bnl11237dat01][DWH11]$ cat /etc/sysctl.conf
    # Kernel sysctl configuration file for Red Hat Linux
    # For binary values, 0 is disabled, 1 is enabled. See sysctl(8) and
    # sysctl.conf(5) for more details.
    # Controls IP packet forwarding
    net.ipv4.ip_forward = 0
    # Controls source route verification
    net.ipv4.conf.default.rp_filter = 1
    # Do not accept source routing
    net.ipv4.conf.default.accept_source_route = 0
    # Controls the System Request debugging functionality of the kernel
    kernel.sysrq = 0
    # Controls whether core dumps will append the PID to the core filename
    # Useful for debugging multi-threaded applications
    kernel.core_uses_pid = 1
    # Controls the use of TCP syncookies
    net.ipv4.tcp_syncookies = 1
    # Controls the maximum size of a message, in bytes
    kernel.msgmnb = 65536
    # Controls the default maxmimum size of a mesage queue
    kernel.msgmax = 65536
    # Controls the maximum shared segment size, in bytes
    kernel.shmmax = 68719476736
    # Controls the maximum number of shared memory segments, in pages
    #kernel.shmall = 4294967296
    kernel.shmall = 8250344
    # Oracle kernel parameters
    fs.aio-max-nr = 1048576
    fs.file-max = 6815744
    kernel.shmmni = 4096
    kernel.sem = 250 32000 100 128
    kernel.shmmax = 536870912
    net.ipv4.ip_local_port_range = 9000 65500
    net.core.rmem_default = 262144
    net.core.rmem_max = 4194304
    net.core.wmem_default = 262144
    net.core.wmem_max = 1048586
    net.ipv4.tcp_wmem = 262144 262144 262144
    net.ipv4.tcp_rmem = 4194304 4194304 4194304
    Please can someone tell me how to resolve this error?

    CAUSE: User specified one or more of { db_cache_size , db_recycle_cache_size, db_keep_cache_size, db_nk_cache_size (where n is one of 2,4,8,16,32) } AND use_indirect_data_buffers is set to TRUE. This is illegal.
    ACTION: Very Large Memory can only be enabled with the old (pre-Oracle_8.2) parameters

  • "Fatal error: Allowed memory size of 33554432 bytes exhausted" I get this error message when I try to load very large threads at a debate site. What to do?

    "Fatal error: Allowed memory size of 33554432 bytes exhausted"
    I get this error message whenever I try to access very large threads at my favorite debate site using Firefox 4 or 5 on my desktop or laptop computers. I do not get the error using IE8.
    The only fixes I have been able to find are for servers that have Wordpress and php.ini files.

    It works, thanks

  • Maximum value of a sequence and viewing very large numbers

    Hi,
    We have a table which has a column populated using a sequence. It is maintained by code in OWB but is growing at a larger rate than expected,
    e.g. the sequence starts at say 10000000000; we would expect the next row to be created with 10000000001, but there is a large gap in between.
    We have a separate ticket raised with Oracle for this, as it is maintained via dimension operator code.
    However, we have a couple of questions regarding sequences.
    1) How large can they be?
    2) If we query numbers which are over 14 digits long, the tool we are using (PL/SQL Developer) starts to show "e". Is there a way to ensure we can see the whole number (SQL*Plus?), and
    I'm assuming that if we had a number column which is very large, say 18 digits long, joined to another table 18 digits long, then there are no issues?
    Thanks

    1. You really should read the docs.
    "e.g. sequence starts at say 10000000000, would expect next row to be created with 10000000001 but large gap in between." Not necessarily. An Oracle sequence guarantees uniqueness, not continuity.
    There can be gaps by design, especially if the CACHE of the sequence is big.
    "Have a separate ticket raised with Oracle for this as maintained via dimension operator code." You really should read the docs before creating tickets.
    "1) How large can they be?" As large as the NUMBER datatype can be: 38 digits.
    I am wrong. The doc says 28 digits.
    "2) If we query numbers which are over 14 digits long, the tool we are using (PL/SQL Developer) starts to show 'e'. Is there a way to ensure we can see the whole number (SQL*Plus?)" What is printed on screen is a matter of formatting. You can choose the formatting you like in SQL*Plus and SQL Developer.
    "I'm assuming if we had a number column which is very large, say 18 digits long, joined to another table 18 digits long, then no issues?" No issues.
    Edited by: Mark Malakanov (user11181920) on Apr 10, 2013 2:06 PM
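The point that the "e" display is purely a formatting matter can be illustrated outside Oracle too: the same 18-digit value prints in full or in scientific notation depending on the type used to render it. This is a Python analogy, not PL/SQL Developer itself:

```python
n = 10**18                # an 18-digit value, like the sequence column

print(n)                  # 1000000000000000000  (exact integer rendering)
print(float(n))           # 1e+18                (float-style rendering)
```

The stored value is identical in both cases; only the rendering differs.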

  • Hello, I am having issues opening very large files I created, one being 1,241,776 KB.  I have PS 12.1, 64-bit version.  I am not sure if the issues I am having are because of the PS version I have, and whether or not I have to upgrade?


    I think more likely, it's a memory / scratch disk issue.  1.25 gigabytes is a very big image file!!
    Nancy O.

  • Memory consumption for an ASO app with a very large outline

    Solaris 10 running Essbase 9.3.1.3.08 64-bit
    We have an app with a very large outline (3Gb). When we do a dimension restructure, it takes about 30 minutes which I suppose is reasonable. However, it takes up about 13Gb RAM to do this and doesn't release the memory when finished unless we restart the app.
    Is this normal? It seems an incredible amount of RAM to use, even for such a large outline and it doesn't release it when done.
    The box has 32Gb RAM.

    I think it was version 9.3.1.3 that outline compaction started working. The first thing I would try is to compact the outline. The MaxL statement for it is Alter database xxxx.yyyy compact outline;
    ASO outlines tend to grow with every change. Do you really have that many members and attributes that would make that big an outline? If it keeps growing, at some point it will corrupt.

  • Log and Transfer: my transferred files become very large...

    When I use Log and Transfer, my files become very large. For instance, the card that I'm transferring from only holds 32G, and after I transfer the footage to an external drive it has become around 70G. I checked the codec on the transferred files and it's showing Apple ProRes 422 1920 x 1080. Is there an import setting I'm missing? This is all new to me...

    This is normal.
    Your camera likely shoots AVCHD, a highly compressed version of MPEG 4.
    It is so compressed because it uses a "Long GOP" format where only every 15th frame is independent (i frame).
    Those in between only contain the differences to the prior i frame and cannot exist without it.
    (It's actually somewhat more complicated, but now you know what it is called, you can do more research)
    You need the ability to edit on any frame, however. The footage is transcoded to make every frame independent so you can do just that. It takes more space.
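The size growth follows directly from the bitrates. With nominal figures (assumptions for illustration: AVCHD tops out around 24 Mbit/s, while ProRes 422 at 1920x1080/29.97 is roughly 147 Mbit/s), the transcode multiplies storage by about six:

```python
# Nominal bitrates; both figures are assumptions for illustration,
# and actual rates vary with camera settings and frame rate.
avchd_mbps = 24     # AVCHD, typical maximum
prores_mbps = 147   # Apple ProRes 422, 1920x1080 at 29.97 fps

ratio = prores_mbps / avchd_mbps
print(round(ratio))   # 6 -- each GB of AVCHD becomes ~6 GB of ProRes
```

So ending up near 70 GB from a partly full 32 GB card is expected behavior, not a wrong import setting.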

  • Very Large Spacing Issue In Both Mail And Address Book!!

    Hello,
    I need your help to resolve this problem. Initially I had a problem with garbled fonts in my Mail application. I used FontNuke in an attempt to resolve the problem. The result of using FontNuke was that the fonts were un-garbled, but now I have very large spacing in the Mail and Address Book applications. Also, the circles on the left side of the Mail application (that normally indicate the number of unread messages you have) are very large!! The same thing happens with the red star on the Mail icon that indicates the number of unread messages you have.
    I would really appreciate any help that you can provide!
    Thanks,
    SydCam

    Thanks for following up so quickly. I took your advice and checked my fonts. I opened Font Book, got rid of all the "duplicate" fonts and made sure that I had all of the required fonts. I then rebooted, and when I went back to Mail and the Address Book I still had the same problem. There are still very large spaces in my email, both in the header and body. The circle to the left that indicates how many unread emails you have is still very large; it is oversized. I wish there was a way I could show you a screen capture so you can see exactly what I'm referring to.
    Obviously there is something that I'm missing, but I just don't know what it is. I would greatly appreciate any help that you (or anyone else) can provide!
    Thanks in advance!
    SydCam

Maybe you are looking for

  • I have a HP Officejet Pro 8500 wireless. Does it support "air print'?

    I just purchased an iPad 2.  I have set up a wireless network in my home; my laptop, printer and Blu-ray are attached to this. The iPad 2 said the wireless printer must be "air print" enabled.  I cannot find anywhere there is a mention of "air print" o

  • External display crash

    Hi, I have the Unibody MacBook from late 2008. My problem is that when I connect an external display and using extended desktop, the external display crashes randomly. Sometimes I can watch movies or photos for hours and sometimes it crashes after 3

  • Not displaying ship to party for second line item

    Hi All, When I am creating a sales order, I enter the sold-to party, PO number, PO date, material (first line item) and qty. In the partner functions for the first line item it is displaying the ship-to party. But when I enter the second line item & qty, the ship-to party is not

  • I want an app but is in another store

    I want an app but it is in another store; how can I change the store?

  • Cannot import SIM contacts to iPhone 3S

    Hiya. Trying to import contacts to iPhone 3S from SIM, but nothing happens after the 'Importing Contacts' sign appears. Others post the same issue but I can't find a fix that works. Thanks for any help, Joe