Inconsistent frame and DAQ rate

Hi,
I have written (in LabVIEW 7.0) a VI (attached) to read from a DAQ card, three VISA instruments, and two FireWire cameras. Before I included the cameras, everything worked fine. With the cameras, performance decreases over time (the longer the acquisition runs, the lower the frame rate). My questions:
1. How can I optimize the code more, to give me the highest possible frame rate?
2. Is my approach to capturing the images an "OK" way, or is there a better way of doing it? This is my first time 'playing' around with IMAQ 1394.
3. I have a hunch that saving each image as soon as it is in the buffer, instead of capturing everything first and saving later, would improve performance. Could this be true?
4. Last, is there a quick and easy way to produce an AVI file from the recorded images? I realize that the elapsed time between images varies, and this causes an additional headache in producing the AVI file.
By the way, the desktop I am using is a P4 2.4 GHz with 1024 MB of RAM (if I'm not mistaken). I hope what I said makes sense, and I highly appreciate all the help I can get. Thanks!
Shazlan
Attachments:
NIPost.llb ‏539 KB

Shazlan,
The first thing you can do to optimize your code is to use only one IMAQ Create. In your code the IMAQ Create is inside the loop, so you will use more and more memory as the program runs. I have attached a 1394 example that I wrote. It has only one IMAQ Create, and it also implements saving the images to AVI. I hope this helps.
A. Talley
Applications Engineering
National Instruments
Attachments:
Grab1394_and_Save_to_AVI.vi ‏102 KB
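The advice above, keeping IMAQ Create outside the loop, is an instance of a general allocate-once pattern: create a buffer once and reuse it every iteration, instead of allocating a fresh one per frame. A minimal sketch of the same idea in text form (Python here, since LabVIEW is graphical; the function names are illustrative, not a real IMAQ API):

```python
def acquire_leaky(n_frames, frame_size):
    """Anti-pattern: a new buffer every iteration (like IMAQ Create inside the loop).
    Memory use grows with acquisition time, which drags the frame rate down."""
    frames = []
    for _ in range(n_frames):
        buf = bytearray(frame_size)   # fresh allocation each pass through the loop
        frames.append(buf)
    return frames

def acquire_reuse(n_frames, frame_size, process):
    """Pattern: one buffer, created before the loop and reused (one IMAQ Create)."""
    buf = bytearray(frame_size)       # single allocation, outside the loop
    for _ in range(n_frames):
        # ...camera would fill buf in place here...
        process(buf)                  # consume the frame before the next grab
    return buf
```

The second function's memory footprint is constant no matter how long the acquisition runs, which is exactly why moving IMAQ Create out of the loop stops the slowdown.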

Similar Messages

  • Synchronisation of frame grabber and DAQ board

    I want to acquire images and analog voltage signals at the same time, synchronised, using an NI PCI-1428 Camera Link frame grabber and an NI PCI-6014 DAQ board. Can this be done using the RTSI trigger functions? I know that the frame grabber supports this, but what about the DAQ board?
    How about synchronising an NI board with a non-NI frame grabber?
    Thanks a lot!
    Peter

    Hi, Peter,
    No, the PCI-6014 DAQ board does not support RTSI. However, most of the low-cost E-series NI data acquisition cards do support RTSI, such as the PCI-6024E or PCI-6036E, so you might consider using one of those boards instead.
    If you decide to go that route, there is a useful tutorial that describes the signals that can be routed over RTSI for NI DAQ and IMAQ boards:
    Developer Zone Tutorial: Synchronizing Motion, Vision, and Data Acquisition
    as well as several example programs:
    Developer Zone Example: Integrating IMAQ and DAQ with Single Display
    Developer Zone Example: Low-level Triggered Ring (with DAQ-supplied triggers)
    As for a non-NI frame grabber, that would really depend on what support the frame grabber has for timing and synchronization. You could route a trigger signal or scan clock out over the PFI pins on the PCI-6014, but you would need to see if the frame grabber is designed to receive these types of signals.
    I hope that helps!
    Best regards,
    Dawna P.
    Applications Engineer
    National Instruments

  • MSI GT70 Dominator Pro Inconsistent Frames

    I have an MSI GT70 Dominator Pro, and I noticed a slight decrease in game performance after playing for 7-8 hours. I am currently playing Tomb Raider 2013, and before I started playing I did an overall performance test of my laptop with a built-in benchmark.
    My Specs:
    Intel Core I7-4800mq
    Nvidia GeForce GTX 880M 8GB
    1920*1080 LCD 17.3 inch Screen
    16 GB Memory
    1TB 7200RPM-128GB SSD
    Initial Benchmark before playing the game.
    Tomb Raider (2013) on Ultimate setting (Tressfx on)
    Min-34
    Max-68.0
    Avg-49.1
    Post Benchmark after playing for 7-8 hours
    Tomb Raider (2013) on Ultimate setting (Tressfx on)
    Min-28.2
    Max-58.0
    Avg-40.7
    Note: GPU and CPU temperatures were the same on both the initial and post benchmarks: GPU max 76C, CPU max 55C. Same background programs, same video card drivers, and same game version. I would like to know why there has been a decrease in performance even though all these factors are the same.

    Welcome to the MSI club, buddy! ...there have been quite a number of threads about inconsistent/low frame rates. I too own an MSI GT60 with a GTX 870M and am seeing inconsistent frame rates. Hope some vBIOS update fixes this.

  • A problem with delays in timed loops and DAQ

    I am programming a simulation of nuclear rewetting for a visitor centre at my company in Switzerland. It involves heating a "fuel rod" and then filling the chamber with water. The pump automatically starts once the rod core reaches 750C. After this, a requirement stipulates that the flow rate be checked to ensure the pump is operating at the necessary conditions. If it isn't, the heater must be shut down to avoid, well... meltdown. However, we must allow 10 seconds for the pump to respond, while still allowing a DAQ rate of 10-100 Hz.
    The challenge is that I can't add a delay in my main loop without delaying all acquisition, but I can't figure out how to trigger a peripheral loop (with DAQ for the single channel that checks flow) from the main loop, and then, once the peripheral loop determines that flow has initialised, respond back to the main loop with the okay.
    I think much of my confusion is in the interaction of the loops and the default feedback nodes that LabVIEW is putting in willy-nilly. Would the only solution be to have two 'main' loops that don't communicate with each other but rather do the same thing while operating on different timing? Tell me if you want me to post the file (although it's on an unnetworked computer and I didn't think it would be too useful).
    Thanks+ Curran
    Solved!
    Go to Solution.

    Here it is! It is not in any form of completion unfortunately.
    So reading in the temp with NI9213 and watercolumn height with NI9215, we determine to turn on the pump with NI9472. NI9421 determines whether the pump is on (there is flow) and I must respond accordingly.
    I have 3 scenarios similar to this one as well, so having redundant loops with different timing like I mentioned would be way too heavy. I think I may have thought up a solution: at the time the pump is initiated, we record the iteration and wait for the number of iterations that corresponds to 10 s to pass before enforcing the pump-shutoff requirement?
    Attachments:
    rewettin1.vi ‏15 KB
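Curran's iteration-counting idea is the standard non-blocking timeout pattern: record when the pump was switched on, and on every loop iteration compare the elapsed time against the grace period instead of sleeping. A rough Python sketch of that logic (the names and structure are made up for illustration, not taken from the VI):

```python
import time

PUMP_GRACE_S = 10.0  # allow the pump 10 seconds to establish flow

def heater_should_shut_down(pump_started_at, flow_ok, now=None):
    """Non-blocking check: never sleeps, so the main loop keeps its 10-100 Hz DAQ rate.

    pump_started_at: time.monotonic() value captured when the pump was commanded on,
                     or None if the pump has not started yet.
    flow_ok:         latest flow-sensor reading (True if flow is detected).
    """
    if pump_started_at is None:
        return False                      # pump not commanded yet; nothing to enforce
    now = time.monotonic() if now is None else now
    if now - pump_started_at < PUMP_GRACE_S:
        return False                      # still inside the 10 s grace window
    return not flow_ok                    # grace over: no flow means shut the heater down
```

Because the function only compares timestamps, it can be called once per acquisition iteration with no effect on loop timing, which is the same effect as counting iterations at a known loop rate.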

  • MyRIO memory, data transfer and clock rate

    Hi
    I am trying to do some computations on a previously obtained file sampled at 100Msps using myRIO module. I have some doubts regarding the same. There are mainly two doubts, one regarding data transfer and other regarding clock rate. 
    1. Currently, I access my file (size 50 MB) from my development computer's hard drive in the FPGA through a DMA FIFO, taking one block of around 5500 points at a time. I have been running the VI in emulation mode for the time being. I was able to transfer through DMA from the host, but it is very slow (I can see each point being transferred!). The timer connected in the while loop in the FPGA says 2 ticks for each loop, but the data transfer is taking long. There could be two reasons for this: one, that the serial cable used is the problem, and the DMA happens fast but the update as seen by the user is slower; the other, that the timer is not recording the time for the data transfer. Which one could be the reason?
    If I put the file on the myRIO module, I will have to compile it each and every time, but does it behave the same way as before with the dev PC (will the DMA transfer be faster)? And here too, do I need to put the file on the USB stick? MAX says that there is 293 MB of primary disk free space on the module, but I am not able to see this space at all. If I put my file in this memory, will the data transfer be faster? That is, can I use any static memory on the board (>50 MB) for my file? Or can I use any data transfer method other than the FIFO? This forum thread (http://forums.ni.com/t5/Academic-Hardware-Products-ELVIS/myRIO-Compile-Error/td-p/2709721/highlight/... discusses this issue, but I would like to know the speed of the transfer too.
    2. The data in the file is sampled at 100 Msps. The filter blocks inside the FPGA ask me to specify the FPGA clock rate and sampling rate. I created a 200 MHz derived clock and specified it, gave the sampling rate as 100 Msps, but the filter is giving zero results. Do these blocks work with derived clock rates, or is that a property of SCTLs alone?
    Thanks a lot
    Arya

    Hi Sam
    Thanks for the quick reply. I will keep the terminology in mind. I am trying to analyse the data file (each set of 5500 samples corresponds to a single frame of data) by doing some intensive signal processing algorithms on each frame, then averaging the results and displaying them.
    I tried putting the file on the RT target, both using a USB stick and using the RT target's internal memory. I thought I would write back the delay time for each loop, after the transfer has occurred completely, to a text file on the system. I ran the code by making an exe for both the USB-stick and RT-internal-memory methods, and by compiling using the FPGA emulator in the dev PC VI. (A screenshot of the last method is attached; the same is used for the other methods with minor modifications.) To my surprise, all three of them gave 13 ms as the delay. I certainly expected the transfer from RT internal memory to be faster than USB, and the one from the dev PC to be the slowest. I will work more on this and try to figure out why it is happening.
    When I transferred the data file (50 MB) into the RT flash memory, MAX showed a 50 MB decrease in the free physical memory but only a 20 MB decrease in the primary disk free space. Why is this so? Could you please tell me the difference between them? I did not find any useful online resources when I searched.
    Meanwhile, the other doubt still persists: is it possible to run the filter blocks with derived clock rates? Can we specify clock rates like 200 MHz and sampling rates like 100 Msps in the filter configuration window? I tried, but obtained zero results.
    Thanks and regards
    Arya
    Attachments:
    Dev PC VI.PNG ‏33 KB
    FPGA VI.PNG ‏16 KB
    Delay text file.PNG ‏4 KB

  • What are supported sampling frequency and digitization rates for Zen Micropho

    Some MP3s are playing back slowly. It is inconsistent, though, since files with the same digitization rate and sampling frequency behave differently: some play at the right speed, some slow.
    The bulletin board says:
    "My tracks don't play at the correct speed e.g. they play too slowly, why?
    Chances are they are encoded in an unsupported sampling frequency..."
    How do I find out what the supported sampling frequencies and digitization rates are?

    Thanks for the info. I guess I was looking for something more specific: the exact bitrates and sample rates that Creative claims to support. Would you know where official and comprehensive data can be had? There must be a tech spec somewhere.
    It is common these days in business to see a recording of, say, a conference call or seminar presentation at 32k bitrate/025Hz, or even 24k bitrate/8000Hz, posted to a company's website for download by those who could not be there, and MP3 players are increasingly used for their replay. Companies use low digitization rates because there is no need for hifi and the files are much smaller: less storage, faster download.
    I'd be surprised to think that Creative don't have compatibility with the standard range of rates offered by ubiquitous programs like Audacity and dBpower, the latter being one they themselves recommend!

  • Drop Frame and Non Drop Frame

    Does QuickTime Pro 7 allow you to select between drop frame and non-drop frame?

    I don't understand your question.
    QuickTime Player will automatically reduce the implied frame rate to keep the file playing. You can observe this by viewing any HD movie trailer on an unsupported OS.
    The frame rate drops so much that the file is nearly unviewable, yet the audio portion continues to play back without issue.
    The QuickTime author can't set these parameters, but QuickTime Pro users can override the issue by selecting "Play All Frames". This can quickly bring playback to its knees on older systems, but it will not drop any frames during playback. It may take two days to view each frame of a 1080 HD movie trailer on a G3, but you should see each one.

  • Need OID for rxload and input rate

    Hi all,
    I want to monitor the rxload and input rate on interfaces, but I can't find the correct OID. Can someone help me with this, please?
    Can someone also provide me a good document for this?
    GigabitEthernet4/8 is up, line protocol is up (connected)
      Hardware is Gigabit Ethernet Port, address is 000d.657d.3387 (bia 000d.657d.3387)
      Description: --- 2924_MAIN ---
      MTU 1500 bytes, BW 100000 Kbit, DLY 10 usec,
         reliability 255/255, txload 1/255, rxload 1/255
      Encapsulation ARPA, loopback not set
      Keepalive set (10 sec)
      Half-duplex, 100Mb/s, link type is auto, media type is 10/100/1000-TX
      input flow-control is off, output flow-control is off
      ARP type: ARPA, ARP Timeout 04:00:00
      Last input 00:00:47, output never, output hang never
      Last clearing of "show interface" counters 20w1d
      Input queue: 0/2000/0/0 (size/max/drops/flushes); Total output drops: 0
      Queueing strategy: fifo
      Output queue: 0/40 (size/max)
      5 minute input rate 5000 bits/sec, 4 packets/sec
      5 minute output rate 153000 bits/sec, 26 packets/sec
         143271178 packets input, 34567505950 bytes, 0 no buffer
         Received 2731549 broadcasts (295416 multicast)
         0 runts, 0 giants, 0 throttles
         0 input errors, 0 CRC, 0 frame, 0 overrun, 0 ignored
         0 input packets with dribble condition detected
         253476590 packets output, 117220536494 bytes, 0 underruns
         1341644 output errors, 542074 collisions, 0 interface resets
         0 babbles, 259071 late collision, 0 deferred
         0 lost carrier, 0 no carrier
         0 output buffer failures, 0 output buffers swapped out

    Hi,
         I haven't been able to find any documentation on CCo regarding this, but in my searching, I have come across the following MIb Object, which will tell you the input rate on the interface :
    .1.3.6.1.4.1.9.2.2.1.1.6
    locIfInBitsSec OBJECT-TYPE
        -- FROM     OLD-CISCO-INTERFACES-MIB
        SYNTAX          Integer
        MAX-ACCESS     read-only
        STATUS          Mandatory
        DESCRIPTION    "Five minute exponentially-decayed moving average of input bits per second."
         There is a similar object  locIfOutBitsSec (.1.3.6.1.4.1.9.2.2.1.1.8) for the output rate.
     So: rxload = 255 * locIfInBitsSec / (1000000 * ifHighSpeed)
    Regards
    Derek
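Derek's formula is a straight proportion: the 5-minute input rate as a fraction of link capacity, scaled to Cisco's 0-255 load range. A quick sketch of the arithmetic (this just evaluates the formula; it is not an SNMP poller):

```python
def rxload(in_bits_per_sec, if_high_speed_mbps):
    """Cisco-style interface load on a 0-255 scale.

    in_bits_per_sec:    locIfInBitsSec (.1.3.6.1.4.1.9.2.2.1.1.6), the
                        5-minute exponentially-decayed input bit rate
    if_high_speed_mbps: ifHighSpeed, the interface speed in Mbit/s
    """
    return 255 * in_bits_per_sec / (1_000_000 * if_high_speed_mbps)

# The interface in the post shows "BW 100000 Kbit" (100 Mbit/s) and a
# 5-minute input rate of 5000 bits/sec. That computes to about 0.013 on
# the 0-255 scale, i.e. essentially idle, consistent with the rxload
# 1/255 shown in the "show interface" output.
idle_load = rxload(5000, 100)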

  • How to resolve audio sampling and bit rate differences before sound mix

    I'm editing a project where an additional recorder (Zoom H4n) was used and synched to the camcorder track with Plural Eyes.  Some of the sampling and bit rates don't match within the Plural Eyes synchronized clips and I need to send this out for a sound mix.  Could I use Compressor to transcode the audio? Would I have to go back to my original separate tracks, transcode in Compressor and then re-synch using Plural Eyes?  Next time ProRes from the beginning -
    but if there's an easier fix for my current problem it would be much appreciated.

    Thanks Michael -- you've helped me with the project before (back in May of this year). 
    I started this project as a newbie back in 2010 and was a bit overwhelmed (i.e. just happy to get everything into FCP and be able to edit).
    I'll try the media manage solution, but if that doesn't work I think I'll live with the one-frame drift. I'm assuming the only other alternative would be to go back to all the original video and audio files, transcode using Compressor, and then re-synch with Plural Eyes and go back in to match up the edits?
    Thanks to your continuing help, I should know how to set things up correctly from the get go for the next one!

  • Spectrum analyzer with sim and DAQ IV File

    Hello..
    I have a project in which I need to design a spectrum analyzer with a simulation mode and a DAQ option (so you can generate the input signal from LabVIEW or from DAQ).
    All I am looking for is a modern spectrum analyzer VI for reference and education purposes.
    I know there are examples out there in LabVIEW, but I need a modern one with a lot of options and features (like impulse response, markers... filters...).
    There is something I found here on the website from a long time ago; a professor from Vietnam created such a project. It's not with DAQ, but it's great for me.
    Here is the link:
    http://digital.ni.com/worldwide/singapore.nsf/web/all/42B390E4624228D486257249001A5349
    This kind of spectrum analyzer looks great.
    Most appreciative of any help.

    Hi Eldad,
    Thanks for posting and welcome to the NI forums!
    The application you have linked uses LabVIEW to control a Tektronix TDS220 over RS232 (a.k.a. the serial port). The TDS220 has a 1 GHz sample rate with a 100 MHz bandwidth; the specs on our DAQ devices don't go nearly this high, so our High Speed Digitizers would be a better option if you need something comparable. Also, the programming of the two devices (TDS220 vs. DAQ) is going to be quite different, so I am not sure how helpful the code from the link would be for your application. I don't have access to the code or permission to post it here, but you could always try contacting the original author (his contact info is in the link you provided; not sure if it's up to date or not). Again, I'm not really sure how helpful the code would be for you.
    You might want to take a look at the following example to see if it does what you need:
    NI-DAQmx: Benchtop Spectrum Analyzer
    There are also numerous examples that are installed with the DAQmx driver to help you get started with the API. You can access them from LabVIEW by going to:
    Help >> Find Examples... >> Hardware Input and Output >> DAQmx
    There may not be a ready-made example that will do everything that you require, but the above should give you a good starting point for implementing the necessary features yourself. If you run into any specific issues while trying to program, please don't hesitate to post to the forum and we'll be more than happy to help out.
    -John
    John Passiak

  • FPGA and DAQ card synchronization

    Hi, we are controlling and acquiring data from multiple hardware devices (including translational stages and photodetectors). Until last week, we performed all control and acquisition using a PCIe-7852R FPGA board. However, we decided to switch the acquisition part to a PCIe-6363 DAQ card to improve the voltage resolution. During testing, I found that the internal clocks in the FPGA and the DAQ card are slightly mismatched (not just a phase delay, but a difference in time period).
    I know this because I generated a square wave (period = 20 us) using the FPGA and acquired it using the DAQ card (at a rate of 200 kHz, i.e., 1 sample every 5 us). I observed that the acquired square wave shifts by 5 us every 5 seconds or so. Such a shift does not occur if the generation and acquisition are done on the same board. Therefore, the only explanation is that the clock frequencies of the FPGA and DAQ cards differ. According to my calculation, the percentage difference between their clock periods must be 5 us / 5 s = 0.0001%.
    Therefore, I am wondering if there is any way to synchronise the clocks between them. Or is there a way I can drive the DAQ device from the FPGA clock, or vice versa? Also, please let me know if there is something trivial that I have to fix.
    Thank you very much.
    Regards,
    Varun
    Solved!
    Go to Solution.

    Hi GerdW,
    Thank you for your reply. 
    I understand both solutions you have suggested. I had conceptually thought about the first one (to control the sampling rate). However, I still haven't figured out how to accurately generate a 200kHz square wave - I think I can figure that one out this morning.
    However, I am unsure how to implement the second option. I presume the default internal clock inside the DAQ is running at 100 MHz or so. Therefore, in order to obtain the same performance while using an external clock, I would have to generate at least an 80 MHz TTL clock out from the FPGA. Could you please tell me how I can do that? I couldn't find any clock-out capabilities on my FPGA. Or would I have to generate a clock of my own using Single-Cycle Timed Loops?
    Not only did you provide two different solutions, your reply suggested that I wasn't wrong in interpreting that the clock periods of our FPGA and DAQ must be slightly off. 
    Thank you very much.
    Regards,
    Varun
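Varun's drift estimate is easy to sanity-check numerically: a 5 us slip every 5 s is a fractional frequency offset of 1e-6, i.e. 1 ppm (0.0001%), which is well within the tolerance of typical crystal oscillators on separate boards. A small sketch of the arithmetic:

```python
def fractional_offset(slip_s, observed_over_s):
    """Fractional clock mismatch implied by observing slip_s of drift
    accumulated over observed_over_s of wall time."""
    return slip_s / observed_over_s

def time_to_slip(slip_s, offset):
    """How long two free-running clocks with the given fractional offset
    take to drift apart by slip_s."""
    return slip_s / offset

# The post's observation: 5 us of shift every 5 seconds.
offset = fractional_offset(5e-6, 5.0)        # 1e-6, i.e. 1 ppm (0.0001%)
# At 1 ppm, a full 20 us period of the test square wave slips in about 20 s.
full_period_slip_s = time_to_slip(20e-6, offset)
```

This is why the shift only appears across two boards: on a single board, generation and acquisition share one oscillator, so the offset between them is zero by construction.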

  • How can I set the DAQ rate on the CompactDAQ NI 9237?

    I am using the NI 9237 module in the CompactDAQ 9172 chassis, with a single channel, in LabVIEW 8.2. I cannot find any software where I can set the DAQ rate on the module. The DAQ Assistant in LabVIEW, the Measurement & Automation Explorer test panel, and the example VIs all seem stuck at a rate of 5000 samples/sec, and I only need 32 samples/sec. The software offers an input variable for "rate" but does not respond. Note that the Measurement & Automation Explorer test panel seems to allow higher rates but not lower. Am I missing something? My application has been crashing from time to time for not being able to retrieve the data fast enough, so I thought I would minimize the rate to lower the transfer load on the operating system.

    Hello Alfonso,
    It sounds like you might be getting errors -200279 and -200278.  (In the future, if you post the actual error codes, it helps us to know exactly what is happening).  Error -200279 happens when you are performing a hardware-timed acquisition (meaning the data is sampled according to a clock signal on your board), but your LabVIEW program is not reading the values from the buffer allocated for that task in computer memory fast enough.  Basically it's a buffer overflow error.  It means older samples have been overwritten before you attempted to read them out.  As the error message suggests, "increasing the buffer size, reading the data more frequently, or specifying a fixed number of samples to read instead of reading all available samples might correct the problem."  For more information on this error, please see the KB (DAQmx) Error -200279 During a Continuous, Buffered Acquisition.
    Error -200278 happens most often when you have configured a finite acquisition, but are calling the DAQmx Read function in a loop.  If you want to perform a finite acquisition, you should only call DAQmx Read once.  For more information on this error, see the KB Error -200278 at DAQmx Read.
    Finally, please refer to Abhinav's earlier post about the sample rate on the 9237 module.  As he described, the NI-DAQmx 8.3 driver will only allow you to set the sample clock to integer divisions of 50k (50,000/n, where n can be 1, 2, 3...13).  Since the maximum divisor is 13, the smallest sample rate that can be used is 3.846 kS/s.  You can check what value the driver is actually using for the sample clock by reading from the SampClk.Rate property of the DAQmx Timing property node.
    I hope this helps!  Let me know if you have any questions about what I've described.
    Best regards,
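The divisor rule described above (sample clock = 50 kS/s divided by an integer n from 1 to 13) explains why a request for 32 S/s silently lands near 3.846 kS/s. A small sketch of the achievable rates and one plausible coercion to the closest achievable value (the real driver's rounding rule may differ; reading SampClk.Rate, as the reply says, gives the authoritative value):

```python
# Achievable NI 9237 sample clocks under NI-DAQmx 8.3, per the post:
# 50 kS/s divided by an integer n from 1 to 13.
ACHIEVABLE = [50_000 / n for n in range(1, 14)]   # 50000.0 down to ~3846.15

def coerce_rate(requested):
    """Illustrative coercion of a requested rate to the nearest achievable
    sample clock. (Assumed nearest-value behaviour, for illustration only;
    check the SampClk.Rate property for what the driver actually chose.)"""
    return min(ACHIEVABLE, key=lambda r: abs(r - requested))
```

With this rule, any request below the minimum divisor rate, including 32 S/s, coerces to 50,000/13 (about 3,846 S/s), so decimating or averaging in software is the way to get an effective 32 S/s.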

  • Independent Sample and log rates

    I would like to log data at 0.5-second intervals while the DAQ Assistant acquires data at 1 kHz.
    I tried using the Write To Measurement File express VI, but couldn't separate the sampling and logging rates.
    Any suggestions with code is always appreciated.
    Thanks !!
    Attachments:
    logger rates.vi ‏87 KB

    Your VI is saved in LabVIEW 2013, so I can't open it; either save it in a previous version and upload that, or share a snapshot of the block diagram.
    Well, have you tried running two parallel loops, one for acquisition and the other for data logging?
    Refer to these:
    1. Application Design Patterns: Producer/Consumer
    2. Data Acquisition Reference Design for LabVIEW
    I am not allergic to Kudos, in fact I love Kudos.
     Make your LabVIEW experience more CONVENIENT.
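The producer/consumer pattern the reply points to decouples the 1 kHz acquisition from the 0.5 s logging interval: the producer pushes every sample into a queue, and the consumer drains it in blocks and writes one record per block. A stdlib-only Python sketch of the structure (the DAQ read is simulated with a counter; in the real VI the producer loop would hold the DAQ Assistant read):

```python
import queue
import threading

def producer(q, n_samples, done):
    """Acquisition loop stand-in: pushes samples at full rate (1 kHz in the post)."""
    for i in range(n_samples):
        q.put(float(i))               # in the real application: the DAQ read
    done.set()                        # signal that acquisition has finished

def consumer(q, done, block_size):
    """Logging loop: drains the queue in blocks (e.g. 500 samples = 0.5 s at 1 kHz)
    and emits one averaged record per block instead of one record per sample."""
    records, block = [], []
    while not (done.is_set() and q.empty()):
        try:
            block.append(q.get(timeout=0.1))
        except queue.Empty:
            continue
        if len(block) == block_size:
            records.append(sum(block) / block_size)   # one log entry per interval
            block = []
    return records

q, done = queue.Queue(), threading.Event()
t = threading.Thread(target=producer, args=(q, 1000, done))
t.start()
records = consumer(q, done, block_size=500)
t.join()
# records now holds one averaged entry per 500-sample block
```

The queue is what lets the two loops run at independent rates without losing samples, which is exactly the role of the LabVIEW queue in the Producer/Consumer design pattern linked above.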

  • Simultaneous Video and DAQ Start

    Pardon my ignorance in the signals world, but I'm having a hard time trying to simultaneously start a video camera and DAQ hardware.  Hopefully someone can provide some insight.
    I will be using a standard camera that accepts triggering and a cDAQ chassis.  I know that I need to use DAQmx for the DAQ (I have a lot of experience with this), but I've never done triggering before in DAQmx.  How do I perform triggering with DAQmx to send the camera a TTL signal to start taking video?  What hardware will I need to send this signal?  Thanks in advance.
    Nathan - Certified LabVIEW Developer

    Nathan,
    What kind of camera bus are you using: USB, FireWire, or GigE? Are you able to acquire images with your camera in Measurement & Automation Explorer (MAX)? Here is a link to a community example that grabs and saves to AVI using IMAQdx. If the camera has digital I/O lines to accept the TTL signal pulse, you can run the example I've linked to and configure it for a triggered acquisition in MAX. The camera's user manual should specify which pins are trigger lines and whether a specific I/O cable or terminal block is necessary. If the camera supports triggered acquisition, you can go into MAX, select the camera under Devices and Interfaces, then select its Camera Attributes tab. There should be an attribute called "trigger mode"; its default setting is off, so you would need to turn it on. If there are multiple trigger lines, you would need to specify the "trigger source" under its allotted attribute. Some cameras can only trigger the start of an acquisition, while others can also trigger each frame acquisition. You can specify whichever you'd like under the "Trigger Selector" attribute.
    Once the camera is configured to trigger, I would have it perform its image acquisition in its own loop, as in the example I linked, and then have the DAQmx tasks perform the data acquisition in a separate loop. Then all you would need to do is synchronize your data acquisition to a single pulse generation task and wire the pulse output to your trigger line. If you wish to control the acquisition of each frame, you would need to generate a pulse train at the desired frequency for your application; however, you would need to be mindful of the camera's limit on how many frames per second it can acquire.
    Regards,
    Isaac S.
    Applications Engineer
    National Instruments

  • Tweening between frames of a simple animated GIF in Photoshop moves the object out of position

    I am trying to create a simple animated gif in Photoshop. I've set up my frames and want to use the tween to make the transitions less jerky. When I tween between frame 1 and frame 2 the object in frame two goes out of position, appearing in a different place than where it is on frame 2. Confused!

    Hi Melissa - thanks for your interest. Here's the first frame, the second frame and the tween frame. I don't understand why the tween is changing the position of the object in frame 2, was expecting it to just fade from one frame to the next.

Maybe you are looking for

  • Connecting a 2009 Macbook pro to a 5k iMac as a second monitor???

    Hey all, I just got a new 5k iMac and I want to use my older (2010) Macbook Pro as a second monitor. How do I do this and what would I need to buy (cord wise) to make that happen? Thanks

  • How to Import a Custom Scale using FieldPoint I/O

    I need to READ   a 0 to 100 Newton strain gauge on graph chart.  It is a full bridge with an excitation Voltage of 10V.   The field point Input module FP-SG-140 is configured for a range of -3.9... to +3.9 mV/V I know how to create a custom scale usi

  • AdvancedDataGrid view Header color and text

    Hi, I want to change the color of the datagrid header as well as the color of the header text (only), not the entire column. I don't know how to do this; please help me because I'm new to Flex.

  • CE font in Adobe Story (ISO-8859-2)

    I have a problem with Polish diacritic signs. Is there a chance to support the ISO-8859-2 standard? Besides, it's a funny thing that I can use a Polish font in the Character BIO name but not in the Script. Please do something about that :)

  • Need a USER EXIT

    Hi friends, when we create a debit/credit note in a DPS activity, the assignment of a partner vendor should be mandatory for sales area 1678-02-01; sales document types are CR, DR, ZRE, ZCOR, ZDOR. Since it is going to take effect globally Need USER