Real Time Arranger Functions?

Can Logic or MainStage function as a real-time arranger and furnish backing tracks on the fly? Is there a plug-in to do this?
I currently use a Roland RA-800 arranger module to create styles and harmonies based on the notes that I am playing on a simple MIDI keyboard. It does this in real time for live performance. This arranger module is basically the electronic circuits from an arranger keyboard (like a Roland G-70) but does not have an actual keyboard.
I would like to replace the dedicated arranger module with a laptop and software. Any suggestions?

wrath_of_con wrote:
I do not have a while loop around that VI (maybe this is why the read/write doesn't work, because it only runs once?)
You definitely need to wrap the code running on your FPGA in a While loop so that it continues running. Commonly we set the loop to run indefinitely (wire a False constant to the loop's conditional terminal) and then control the operation of the VI within the loop using the front panel controls.
But regarding clusters and Read/Write... from the LabVIEW help...
"You can write to a whole cluster, but you cannot write to individual elements of a cluster."
In our application I will definitely need to write to individual elements of a cluster. One basic thing I am doing is controlling a stepper motor driver, giving it pulses, direction, etc. I have been able to make it work using the Call VI and then Bundling by Name whatever cluster I want access to.
To update individual elements of a cluster in your FPGA VI, create individual controls on the front panel of the FPGA VI and add the updated data to the cluster that you are maintaining within your VI. In the new While loop, create a shift register and use it to store the cluster as a state variable of your VI. Then, whenever you need to update any values in the cluster, use the Bundle by Name function as you described.
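LabVIEW is graphical, so there is no text listing to post here, but the shift-register pattern described above is conceptually the same as keeping a struct as loop state and overwriting one field at a time. A rough C analogue follows; all names and fields below are invented for illustration and are not taken from the original VI:

    /* Conceptual analogue of the FPGA pattern: a "cluster" kept in a
       shift register and updated one element at a time per loop pass. */
    #include <stdbool.h>

    typedef struct {              /* the cluster of stepper outputs */
        bool step_pulse;
        bool direction;
        bool enable;
    } StepperState;

    void control_loop(volatile bool *stop)
    {
        StepperState state = { false, false, true };  /* shift-register initial value */

        while (!*stop) {                    /* the surrounding While loop */
            /* "Bundle by Name": change only the element that needs updating */
            state.step_pulse = !state.step_pulse;

            /* hand the whole cluster to the I/O here, e.g. an indicator
               or DMA FIFO in the real FPGA VI (omitted in this sketch) */
        }
    }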
Christian Loew, CLA
Principal Systems Engineer, National Instruments
Please tip your answer providers with kudos.
Any attached Code is provided As Is. It has not been tested or validated as a product, for use in a deployed application or system,
or for use in hazardous environments. You assume all risks for use of the Code and use of the Code is subject
to the Sample Code License Terms which can be found at: http://ni.com/samplecodelicense

Similar Messages

  • Real time business functionality using SAP XI and hardware included?

    Hi All,
    I am quite new to SAP XI. I have practised some of the scenarios that can be done with XI.
    I was just trying to visualise some of the real-time business requirements where SAP XI is used.
    For example:
    Some may have used SAP XI to transform data between a file server and an SAP system, or maybe a web server asking for details from an SAP system to display data, or sometimes SAP XI sits between an integration server (like BizTalk) and an SAP system.
    Can any of you give me some of the real-time requirements you have dealt with over the years,
    in terms of business functionality and the servers involved in the whole system?
    Thanks
    Rakesh

    Hi
    Various scenarios:
    FILE-FILE
    /people/venkat.donela/blog/2005/03/02/introduction-to-simplefile-xi-filescenario-and-complete-walk-through-for-starterspart1
    /people/venkat.donela/blog/2005/03/03/introduction-to-simple-file-xi-filescenario-and-complete-walk-through-for-starterspart2
    /people/srinivas.vanamala2/blog/2007/02/05/step-by-step-guide-xml-file-2-xml-file-scenario-part-i
    XML to PDF
    /people/divya.vidyanandanprabhu/blog/2005/06/28/converting-xml-to-pdf-using-xi
    Work with BPM
    /people/krishna.moorthyp/blog/2005/06/09/walkthrough-with-bpm
    File to IDOC
    /people/venugopalarao.immadisetty/blog/2007/01/24/troubleshooting-file-to-idoc-scenario-in-xi
    E-mail alerts in XI-CCMS
    /people/aravindh.prasanna/blog/2005/12/23/configuring-scenario-specific-e-mail-alerts-in-xi-ccms-part--1
    RFC using BPM
    /people/arpit.seth/blog/2005/06/27/rfc-scenario-using-bpm--starter-kit
    HTTP to RFC
    /people/community.user/blog/2006/12/12/http-to-rfc--a-starter-kit
    IDOC to JMS
    /people/saravanakumar.kuppusamy2/blog/2007/09/26/an-approach-to-handle-total-record-counts-in-idoc-to-jms-scenarios
    /people/sudheer.babu2/blog/2007/01/18/asyncsync-communication-using-jms-adapter-without-bpm-sp-19
    Please reward points if helpful.
    Thanks
    Vishal

  • Configuration of Real time collaboration(IM)

    Hi,
    I would like to configure my portal for instant messaging between portal users. I mean, if users are online then they can chat (somewhat like Yahoo). I need help from scratch: how do I go about it, and what do I need to configure for this?
    It's urgent!
    Thanks & Regards
    Parth

    Working Collaboratively Across Teams – Scenario Installation Guide
    1     Purpose
    The purpose of this scenario is to provide an example of how SAP Enterprise Portal can enable your teams to work collaboratively – within the same functional team, or across functional teams with the use of the Collaboration Launch Pad (CLP).
    Following you will find the information on how you can install this scenario. The Scenario Installation Guide provides you with the information you need to set up your business scenario configuration.
    Note that the installation documentation for SAP Best Practices business scenarios is structured on two levels:
    •     Scenario Installation Guide: one document for a business scenario
    •     Building Block Configuration Guide: one document for a building block, but several documents for a scenario installation
    The Scenario Installation Guide is the backbone of the installation documentation and describes which building blocks you need to install for this scenario and prescribes the order you have to follow when you install them.
    For the installation of each building block, you have to refer to the corresponding Building Block Configuration Guide where you can find detailed configuration procedures of this building block.
    Always use the Scenario Installation Guide (this document) as the starting point for all your installation activities.
    For a successful installation of the entire scenario, it is important:
    •     that you follow the structure in the Scenario Installation Guide from the first to the last installation activity
    •     that you carry out all installation activities of a building block before you start with the next building block.
    2     Prerequisites
    •     You have a live SAP R/3 backend system.
    •     You have SAP GUI installed on your machine to use the Win GUI iView.
    •     You have read the essential document, Building Block N00: Essential Information.
    •     You have the End User role (eu_role) assigned to your user.
    Collaboration Launch Pad: Activate Link:
    The Collaboration Launch Pad (CLP) allows portal users to manage their contacts and launch services for collaboration. Users click a link in the tool area of the portal header to launch the CLP.
    If you want to display the link to launch the CLP in the portal header after installing the Collaboration component, you must activate it manually in the attributes for the tool area of the default framework page.
    Procedure
    1.     Open the portal in a browser and go to:
    SAP Netweaver Portal     Content Administration → Portal Content → Portal Users → Standard Portal Users → Default Framework Page
    2.     Double-click Default Framework Page.
    3.     Select the Tool Area iView checkbox.
    4.     Choose Open. The Property Editor page displays.
    5.     Choose Show All in the Property Category drop list menu.
    6.     Choose the Enable Collaboration Launch Pad properties.
    7.     Select the Yes radio button.
    8.     Choose Save
    9.     Choose Close.
    10.     Choose F5 to refresh your portal screen.
    3.3     Collaboration Launch Pad: Making Services Available
    SAP delivers a default configuration for making services available in the following menus:
    •     Collaboration menu of the Collaboration Launch Pad (CLP) and the room member list
    •     Context menu for the displayed user names
    Procedure
    11.     For the preconfigured groups of services (command groups) to be available for users, you must assign them to the respective purpose (type) provided.
    12.     Do the following steps and choose:
    SAP Netweaver Portal     Content Administration → Collaboration Content → Collaboration Launch Pad Administration → Command Group.
    13.     Copy the existing command groups to your own namespace.
    14.     Use the Duplicate function and save each command group with a different name. In this way, you prevent the command groups used from being overwritten during the next system upgrade.
    15.     Make a note of the names( for example, clpGroup_2 and userGroup_2) that you have assigned for the command groups clpGroup and userGroup.
    16.     Choose Type in the left-hand navigation area (Topics).
    17.     Choose the clpType checkbox.
    18.     Choose the Edit button.
    19.     Choose the Command Group (for example, clpGroup_2) you saved for clpGroup in the contextmenu_commandgroup drop list menu.
    20.     Choose OK.
    21.     Choose the userType checkbox. 
    22.     Choose the Edit button.
    23.     Choose the Command Group (for example, userGroup_2) you saved for userGroup in the contextmenu_commandgroup drop list menu.
    24.     Choose OK.
    3.4     Activating the RTC Session Manager
    In order to be able to use the Real-Time Collaboration functions (sending instant messages or sharing applications), you must activate the RTC Session Manager after installing Collaboration.
    Procedure
    To activate the RTC Session Manager, perform the following steps:
    25.     Go to the default framework page.
    Enterprise Portal     Portal Content → Portal Users → Standard Portal Users → Default Framework Page.
    26.     Double-click Default Framework Page.
    27.     Select the Tool Area iView checkbox and choose Open.
    28.     Choose Show All in the Property Editor drop list menu.
    29.     Expand the Enable Real-Time Collaboration properties. 
    30.     Select the Yes radio button.
    31.     Choose Save.
    32.     Choose Close.
    33.     Choose F5 to refresh your portal screen.
    3.5     Activated Service Types
    Two synchronous collaboration service providers are included with the initial installation of the Collaboration for SAP Enterprise Portal. The initially installed synchronous collaboration service providers are:
    •     Real Time Collaboration (SAP RTC), which provides application sharing.
    •     WebEx, which provides the WebEx meeting center service.
    Upon installation of Collaboration for SAP Enterprise Portal, synchronous collaboration services are not activated. You must activate synchronous collaboration service types before they can be used.
    Procedure
    34.     Review the list of activated service types.
    Go to System Administration → System Configuration → Knowledge Management → Collaboration → Synchronous Collaboration Services → Activated Service Types.
    35.     Choose the New button.
    36.     Check the RTC_ApplicationSharing checkbox.
    37.     Check the WebEx_WebExMeetingCenter checkbox.
    38.     Choose the OK button.

  • Real Time deployment

    Hi SAP Experts,
    I would like to understand the real-time deployment functionality in more detail. Does real-time deployment work for multi-level location scenarios?
    As I understand it, real-time deployment first runs the SNP heuristic to consider the latest demand/supply situation, but does it run the heuristic for the entire network for that product, or does it only look at the locations connected one level up from the supply location?
    Example: I have markets A, B and C, which are connected to in-stock locations X, Y and Z, which send the net requirements to the plants 1, 2 and 3.
    My supply chain is: plants 1, 2 and 3 are connected to stock locations X, Y and Z, which in turn supply the markets A, B and C.
    If I want to run the real-time deployment using supplying plants 1, 2 and 3:
      Will the real-time deployment only deploy receipts to the in-stock locations X, Y and Z (one level up from the plants), or
      will I see the deployment results at markets A, B and C (all the way through the network up to the markets)?
      Does it look at low-level codes in the network while running the heuristic for the product?
    In other words, does real-time deployment at the supply locations only look at in-stock locations X, Y and Z, or does it actually run the heuristic at the market locations A, B and C to update all the net requirements at X, Y and Z, then run the heuristic at X, Y and Z to calculate net requirements for plants 1, 2 and 3, and then deploy in reverse order back up from the supply plants to the markets?
    Thanks and regards,
    Amit

    Hi Amit,
    Like you, I am also exploring real-time deployment.
    As far as I have understood, real-time deployment will run a heuristic between the source and destination location (if specified in SNP02), or run the heuristic on all t-lane connections between the source and all possible destination locations (if the destination location is not specified in SNP02), and then perform the normal deployment run.
    This is valid for scenarios in which you want to consider destination-location demand arriving after the last SNP run.
    Hope this helps.
    Do come back with your testing results.
    Regards,
    Vinay

  • How to implement the Transfer Function in Real Time VIs?

    Hi all,
    I'm relatively new to the LabVIEW Real-Time module and want to implement a controller (not a PID one) in a deterministic loop! I have already designed a discrete transfer function and am searching for a way to build a controller with it! Is it right to use the Simulation Loop instead of the Timed Loop (in this case a deterministic loop) and implement the controller in it? If so, should I use the same clock and period as the Timed Loop?
    It would be very helpful, if there is an Example about using transfer function in Real Time Loops!
    thanks for your help,
    Mohsen 

    Hello mhmdi,
    Sorry, I can't open your VI (it looks like it was saved in a newer version of LabVIEW than I have installed).
    You're right - you don't need to convert to a difference equation if you have the CD&SIM Module, which can take discrete-time transfer functions directly. You don't need shift registers with that function, as this is effectively done internally. If you don't have CD&SIM, discrete TFs can be implemented easily in a timed (or While) loop with feedback nodes or shift registers to replicate each z^-1 you need.
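    For reference, a second-order discrete transfer function H(z) = (b0 + b1·z^-1 + b2·z^-2) / (1 + a1·z^-1 + a2·z^-2) reduces to the difference equation sketched below; the shift registers (or feedback nodes) simply hold the u[n-1], u[n-2], y[n-1], y[n-2] terms between loop iterations. This is only a minimal C sketch with made-up coefficients, not NI-provided code:

        /* y[n] = b0*u[n] + b1*u[n-1] + b2*u[n-2] - a1*y[n-1] - a2*y[n-2]
           Call once per timed-loop iteration at the loop's fixed period.
           The four "history" variables play the role of the shift registers. */
        double discrete_tf_step(double u)
        {
            /* example coefficients only -- use the ones from your design */
            static const double b0 = 0.2, b1 = 0.1, b2 = 0.05;
            static const double a1 = -0.8, a2 = 0.15;

            static double u1 = 0.0, u2 = 0.0;   /* u[n-1], u[n-2] */
            static double y1 = 0.0, y2 = 0.0;   /* y[n-1], y[n-2] */

            double y = b0*u + b1*u1 + b2*u2 - a1*y1 - a2*y2;

            u2 = u1;  u1 = u;                   /* shift the registers */
            y2 = y1;  y1 = y;
            return y;
        }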
    Some more ideas:
    There are many configuration parameters and options with the Discrete TF VI - which you need to understand for your application and make sure are correct. Sometimes implementing in a basic form (timed loop and shift registers) allows you to see what is happening without any confusing options you might not need.
    Are you using the Discrete TF VI in a Simulation Loop? You might need to think about whether the loop being used is appropriate for a real-time application. Maybe the timing of the TF, the loop and the DAQmx data are not all suitable for each other.
    I'm not familiar with DAQmx, so I'm not sure about any specific real-time aspects of that.
    "if the sample frequency of the discrete transfer function in the timed loop is at an integer multiple of the sampling rate (e.g. 12000 Hz for the transfer function and 4000 Hz for the sampling rate, 12000 = 3*4000), would it somehow improve the resolution of the controller command?"
    This could actually make things worse - whether it is a problem depends on your transfer function. Think about it this way: in the scenario you describe, the input signal going into the Discrete TF will only change once every three samples. That means if your transfer function includes a 1-z^(-1) factor (i.e. a derivative), that term will be zero for two samples and then jump up for the next sample. You'll observe a very jittery/noisy signal, but the noise is due to the samples not being correct. This will also occur if the sample times are very similar but drift in and out of sync, where you may get the occasional sample that is the same and hence the occasional zero in the TF.
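    To make the "zero for two samples, then a jump" effect concrete, here is a tiny throwaway C example (all numbers invented) that differentiates an input which is only refreshed every third loop iteration:

        #include <stdio.h>

        /* Input refreshed at 4 kHz, loop running at 12 kHz: each input value
           is seen three times, so a 1 - z^-1 (derivative) term produces one
           spike followed by two zeros instead of a smooth slope. */
        int main(void)
        {
            const double input[4] = { 0.0, 1.0, 2.0, 3.0 };  /* slow ramp   */
            double prev = 0.0;

            for (int n = 0; n < 12; n++) {
                double u  = input[n / 3];        /* held for 3 fast samples */
                double dy = u - prev;            /* 1 - z^-1                */
                prev = u;
                printf("n=%2d  u=%.1f  diff=%.1f\n", n, u, dy);
            }
            return 0;
        }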
    In one application we had problems getting the data coming into a timed loop to synchronise with the timed loop itself, and before it was fixed the control signal was very jittery.
    Consultant Control Engineer
    www-isc-ltd.com

  • Divide key figures in real time cube by using planning function

    Dear all,
    I have a real-time cube which contains one key figure defined by 10 characteristics.
    For one country, the users typed the values for the key figure in the wrong format, so I have to divide those values by 1000.
    I have tried to do this with a "Revaluating" planning function and a filter (on this specific country). I have combined these two elements in a planning sequence. When I run it, I just get the error message "planning function ended with errors" without any further information.
    Is there another way to divide values within the cube? The BI release is 7.0.
    Any help would be great.
    Best regards,
    Stefan from Munich/Germany

    Hi Stefanos,
    What you are doing is absolutely right. However, I want to ask a few questions:
    1. Are the characteristic values included in the filter? I.e., if you want to change values for Country: UK, is UK present in the filter?
    2. Are there any other characteristics that you have not included in the function and that are getting changed by chance? For example, you might need to include 0FISCYEAR in the characteristics to be changed if you are assigning 0FISCPER.
    3. Write a small FOX function with hardcoded values to check whether the system is accepting the values or not.
    It should work. Please write back in case you need more help.
    Regards, Rishi

  • How to create function module in real time

    Hi experts,
    Can somebody explain a real-time requirement for creating a function module?
    It's very urgent.
    Please explain in detail.
    regards,
    subhasis.

    hi,
    In real time,
    1. as far as I know, we create function modules for inbound IDoc / outbound processing.
    2. As an interface between programs.
      For example, if we need to take some data from a report (say materials and plant), process some BAPI and return log files to the report, we can use an FM there too.
    To create a function module
    First you need to create a function group (a function group holds a number of related function modules).
    SE37 --> Goto --> Function Groups --> Create Group: click Create Group.
    Now in the pop-up enter the function group name, e.g. ZW_FG1, plus a short text, and save it.
    Now again go to SE37 --> enter a new function module name, e.g. ZW_FM1,
    and press F5.
    Now again a pop-up appears where you need to enter:
    function group name: ZW_FG1
    and a short text, and save it.
    Now the function module is created under the function group ZW_FG1.
    Write your export, import and table parameters and your source code.
    Reward points if useful.
    regards,
    nazeer

  • Can LabVIEW 7.0 utilize the real-time functions in Windows 2000 real-time?

    Can an application written with LabVIEW 7.0 utilize the real-time functions in Windows 2000 real-time?

    The LabVIEW Data Acquisition Handbook (not sure of the exact title) has examples of simultaneous I/O, which is what I assume you need.
    Clock-controlled analog output (to drive the stepper) and analog input (to read your inputs) sound like the ticket for you.
    You could load the output buffer with a pattern that drives your stepper, and get multiple input samples for every step, all precisely timed; see the rough sketch below.
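    Purely as an illustration of what "load the output buffer with a pattern" could mean, here is a hedged C sketch that pre-computes one buffer of step pulses. The hand-off of the buffer to the DAQ driver is board-specific and omitted, and all names and numbers below are invented:

        #include <stddef.h>

        /* Fill an analog-output buffer with a square-wave step pattern.
           Each pulse level lasts samples_per_level samples of the AO sample
           clock; the same clock can then time the analog-input reads so every
           step lines up with a known number of input samples.              */
        void fill_step_pattern(double *buf, size_t len, size_t samples_per_level,
                               double high_v, double low_v)
        {
            for (size_t i = 0; i < len; i++) {
                size_t phase = (i / samples_per_level) % 2;   /* toggle level */
                buf[i] = (phase == 0) ? high_v : low_v;
            }
        }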
    Steve Bird
    Culverson Software - Elegant software that is a pleasure to use.
    Culverson.com
    Blog for (mostly LabVIEW) programmers: Tips And Tricks

  • When do we use SQLScript/CE functions or procedures in a real-time scenario?

    Hi,
    Can anyone explain when we would use a scripted calculation view, or a procedure using SQLScript/CE functions, in a real-time project implementation?
    Let me know some business requirements for choosing these instead of a graphical view.

    Hi Kumar,
    "So in real time, if you want to develop some interface, do we use existing components or do we create our own user-defined products and components?"
    We can do both.
    If the pre-delivered content actually solves your purpose by providing the exact format of data, then you should go for the existing components.
    If there is any change from the content provided, then you should change it and create your own user-defined products and components.
    "What is the criterion behind choosing either of them?"
    The criterion is purely the type of data formats you have to use. Existing components always save a lot of development time.
    Regards,
    Prateek

  • How to implement Transfer Function in Real Time VI?!

    Hi all,
    I'm relatively new to the LabVIEW Real-Time module and want to implement a controller in a deterministic loop! I already have a discrete transfer function and am searching for a way to build a controller with it! Is it right to use the Simulation Loop instead of the Timed Loop (in this case a deterministic loop) and implement the controller in it? If so, should I use the same clock and period as the Timed Loop?
    It would be very helpful, if there is an Example about using transfer function in Real Time Loops!
    thanks for your help,
    Mohsen 

    Hello Mohsen,
    I am sorry but I do not really understand what you are trying to achieve. Could you be a little bit more specific on your goals? Which controller are you talking about?  
    Best regards 
    Florian Abry
    Application Engineer Group Leader
    NI Germany

  • When I use Call Library Function Node in real time, is the DLL loaded once for all or load every time it is called?

    When I use the Call Library Function Node in real time, is the DLL loaded once for all or loaded every time it is called?
    I have a time-critical real-time application in which I use a DLL function developed in C++. Is that OK? Could any senior developer reassure me?
    Thank you in advance.

    qing_shan61 wrote:
    When I use the Call Library Function Node in real time, is the DLL loaded once for all or loaded every time it is called?
    Once
    qing_shan61 wrote:
    I have a time-critical real-time application in which I use a DLL function developed in C++. Is that OK?
    OK
    Be sure that all DLL calls are thread safe (do not perform the calls in the UI thread).
    Also, for a real-time application you need a real-time OS.
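    On the thread-safety point: as I understand it, the Call Library Function Node should only be configured to run in any thread if the C/C++ code behind it is genuinely reentrant - roughly, no unprotected shared state. A hedged C sketch of the distinction (example functions invented here, not from the poster's DLL):

        /* Reentrant: uses only its arguments and locals, so it is safe to
           call from any thread.                                            */
        double scale_sample(double x, double gain, double offset)
        {
            return gain * x + offset;
        }

        /* NOT reentrant as written: the static accumulator is shared state.
           Either protect it with a lock or keep the call in one thread.    */
        double running_sum(double x)
        {
            static double total = 0.0;
            total += x;          /* data race if called from two threads */
            return total;
        }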
    Andrey.

  • Export audio function in Logic - can it be done in real time?

    hey,
    Does anyone know if the export audio option can be done in real time as opposed to an offline bounce? There is a sonic difference between exporting files offline and actually bouncing them individually - enough to make me consider going back to the old-school way of bouncing each track one by one. I obviously don't want to do that, and I am very disappointed to hear the sonic difference between the two. If I do decide to do that, would all of the virtual instruments be sample accurate? I know the EXS24 would be, but what about Spectrasonics stuff or other 3rd-party virtual instruments? Does Logic have an internal engine that makes them sample accurate?
    thanks in advance,
    ej

    hey justin
    I'm not sure I get what you mean. If you are saying that I could, let's say, take 16 outs of Logic and digitally record them in PT, then that is true... however, Logic drifts when locked to timecode, which becomes obvious when you have to do multiple passes (I guess all DAWs do to some extent). Also, let's say I did a stereo bounce of the music and then imported that bounce into PT for the purpose of recording vox. If I then decide to import the tracks via analog or digital, the tracks will be out of time with the original track and require me to start shifting vox to compensate... not fun.
    The thing I like about exporting and bouncing is that I don't have to worry about timing issues that way; it's just that bouncing sounds way better than exporting.
    If I could export in real time then that would solve my problem. I guess I now see why Digi has ignored requests for offline bouncing.
    any thoughts would be appreciated
    ej

  • Does a capture card improve real time functions?

    I am reading posts regarding capture cards, and I figured that for DV material a capture card may not be necessary since FireWire is there. But does a capture card improve the efficiency of the machine/CPU? Does it render faster and play more real-time effects?
    thanks
    sameer

    The Aurora is a Standard Definition card and the Decklink is an HD card. I don't have them both installed at the same time...I put in what I need. Right now I am working with HD so I have the decklink card in.
    AJA I/O is a standard definition "solution." I say solution because it isn't a capture card per se, but rather an external box that captures SD signals (SDI, Component, Composite) and inputs them into FCP via a firewire connection to the computer. This allows it to be used on multiple machines easily. The Kona series are High Def cards that need to be installed in the PCI slots of your computer.
    What is my workflow with the DSR-11? Probably the same as you. DV/NTSC Easy Setup, then connect it to the computer via firewire and capture. When I am done, I output back to tape. What else is there to do with this?
    Codecs? DV, 8-bit uncompressed SD, DVCPRO 50, DVCPRO HD. Depends on the project I am working on and how it was shot.
    10-bit uncompressed DV? DV by nature is compressed 5:1 when it is recorded to tape. When you capture that via FireWire from your DSR-11 deck, you are capturing it at the highest possible quality - it is simply a file transfer. Now, if you are delivering on DV then you don't need a capture card. But if you were, say, needing to deliver a Digibeta master or Beta SP, then you'd need a capture card to output to that format... uprezzing the DV footage to 8-bit uncompressed (better color space, better codec for graphics and stills) and outputting. 10-bit is overkill for DV.
    My workflow depends on the footage I am given and the format I need to master to. My HD workflow differs greatly from my DV workflow, which differs from my SD Uncompressed workflow. To go into detail of each would mean writing an article, and I am too busy editing to do that.
    Shane

  • What are the limitations of using the LabVIEW 8.5.1 Developer Suite versus a real-time module in FieldPoint applications?

    What are the limitations of using the LabVIEW 8.5.1 Developer Suite versus a real-time module in FieldPoint applications? Can an .exe be loaded onto a FieldPoint controller, or does the controlling program have to reside on a PC, for example?

    centerbolt is correct: you can't load a .exe or even run a program on the FieldPoint controller unless you have the Real-Time module. However, that does not mean you can't use your FieldPoint bank without the Real-Time module.
    From LabVIEW for Windows you can make calls to the FieldPoint I/O using the FieldPoint read/write functions.
    Such a program runs on the PC, not the FieldPoint controller. If you lose the network connection to the FieldPoint, your program will lose its connection to the I/O. For many data-logging applications this type of arrangement can work just fine. However, if this is the only type of application you are ever going to run, then you may as well not buy the Real-Time controller for your FieldPoint but only the network controller.
    If your application requires more reliability and/or greater determinism than can be achieved by running a program on Windows, then you should use the LabVIEW Real-Time module and develop a program that can run down on the FieldPoint controller independently of Windows.
    SteveA
    CLD
    FPGA/RT/PDA/TP/DSC
    Attachments:
    fp pallet.PNG (6 KB)

  • Real time tickets

    Hi friends
    Can anyone who is on a support project give me some of the real-time problems reported by end users - tickets, especially any critical tickets, with solutions please?
    thanks & Regards
    maunami

    Real Time Issue:
    Billing Document not released to accounting / Accounts determination:
    To resolve the error, you can analyze account determination in the billing document. Process:
    Goto T.Code: VF02 & Enter Invoice number
    Next (On the top most strip) goto Environment
    Next (Select Environment) go to Account determination
    Next (In Account Determination) select Revenue Account Determination (first option)
    This will list all the condition types in the billing document; analyze each condition and check for which one the G/L account is not determined.
    Possible errors:
    1. VKOA not maintained for required combination
    Solution: Maintain the combination in VKOA.
    2. Account Assignment of Customer / material not maintained in Customer / Material Master (If maintained in combination in VKOA).
    Solution:
    Option 1 (Standard solution):
    step 1: Cancel Billing Document --> Reverse PGI --> cancel Delivery --> Cancel Sales Order
    step 2: Maintain Customer master / Material Master correctly.
    step 3: Recreate sales Order --> Delivery --> PGI --> Invoicing.
    Option 2:
    Force the Account Assignment Group of Customer / Material through Debug in change mode of Billing document, which will release Billing Document to Accounting.
    3. Account Key not maintained in Pricing Procedure:
    Impact: The accounting document may still be created, but any condition type that should post to an account and does not have an account key maintained in the pricing procedure will not be posted to its G/L account.
    4. Billing Document not being released to accounting --
    In the material master, there is a link between Profit Centre and MRP Type. If one of them is not maintained, a few documents erratically get stuck while releasing the billing document to accounting; a few, of course, get posted.
    Solution1: Cancel Billing Document --> Reverse PGI --> Cancel Delivery --> Block the sales Order & Create new sales Cycle all over again after rectifying Material master.
    Solution 2: (Temporary Solution) In Debug mode in Billing, force the Profit Center in Billing Document with the help of Abaper. But ensure Material master is rectified.
    From the FI side, you need to check that all the G/L accounts have been maintained through T.Code FS00. G/L accounts, being master data, have to be created in each client, e.g. uploaded through LSMW / SCATT / BDC.
    In the billing document in change mode (on the first screen where we enter the billing document number), in the top left-hand corner, open the dropdown on Billing Document and select Release to accounting. Here you can get the following possible messages:
    1. G/L account not found
    2. Cost Element not maintained for G/L account.
    In both of the above cases, the FI consultant needs to take corrective action.
    Pricing:
    This is very specific & differs from client to client & may also differ based on scenario.
    Write-up on Pricing -
    In SD, the pricing procedure is determined based on Sales Area (Sales Organization + Distribution Channel + Division) + Customer Pricing Procedure + Document Pricing Procedure. The Sales Area is determined at sales order header level. The Customer Pricing Procedure is determined from the customer master. The Document Pricing Procedure is determined from the sales document type / billing type (if configured). Once the pricing procedure is determined, condition records are fetched. If appropriate condition records are found, the price is determined. If a mandatory pricing condition is missing, the system will throw an error message.
    In SD, the steps to configure Pricing procedure are as under:
    Step 1:
    Condition table: If existing condition table meets the requirement, we need not create a new condition table. Considering the requirement for new condition table, the configuration will be done in spro as follows: IMG --> Sales & Distribution --> Basic Function --> Pricing Control --> Condition Table (select the required fields combination, which will store condition record).
    Step 2:
    Access Sequence: If an existing access sequence meets the requirement, we need not create a new access sequence. Considering the requirement for a new sequence, the configuration will be done in SPRO as follows: IMG --> Sales & Distribution --> Basic Function --> Pricing Control --> Access Sequence. (An access sequence is made up of accesses (tables) and the order of priority in which they are to be accessed.) Here we assign the condition table to the access sequence.
    Step 3:
    Condition Type: If existing condition type meets the requirement, we need not create a new condition type. Considering the requirement for new condition type, the configuration will be done in spro as follows: IMG --> Sales & Distribution --> Basic Function --> Pricing Control --> Condition Type. It is always recommended to copy an existing similar condition type & make the necessary changes. Here we assign Access sequence to Condition type.
    Step 4:
    a. Pricing Procedure: It is recommended to copy a similar pricing procedure & make the necessary changes in new pricing procedure. Pricing Procedure is a set of condition type & arranged in the sequence in which it has to perform the calculation. Considering the requirement for new Pricing Procedure, the configuration will be done in spro as follows: IMG --> Sales & Distribution --> Basic Function --> Pricing Control --> Pricing Procedure --> Maintain Pricing Procedure.
    b. Pricing Procedure: After maintaining the pricing procedure the next step will be determination of pricing procedure. Configuration for determining pricing procedure in SPRO is as follows: IMG --> Sales & Distribution --> Basic Function --> Pricing Control --> Pricing Procedure --> Determine Pricing Procedure.
    5. Condition record: Condition record is a master data, which is required to be maintained by Core team / person responsible from the client. During new implementation, the condition records can be uploaded using tools like SCAT, LSMW, etc.
    It is assumed that document pricing procedure, customer pricing procedure , ... are in place.
    Sales Document not assigned to Sales Area:
    SPRO --> Sales & Distribution --> Sales --> Sales Documents --> Sales Document Header --> Assign Sales Area To Sales Document Types --> Assign sales order types permitted for sales areas (do ensure to maintain combined Sales organization, combined Distribution channel & combined division. for eg: Sales org 1000 & sales org 1000, Sales org 2000 & sales org 2000, & so on ....
    similarly for distribution channel & Division, so that the Sales area combination is available for assignment to Sales Document Type.)
    Issues related to Customer Master data:
    1. What is the impact of leaving the customer pricing procedure and customer statistics group blank in customer master --> sales area data --> sales tab?
    If Customer Pricing Procedure is left blank, Pricing will not be determined.
    If customer statistic group is left blank, then data will not flow to standard reports.
    2. Who maintains reconciliation account in customer master?
    Ideally, reconciliation account is maintained by FI person, but if SD person is authorized & has the knowledge of which reconciliation account to be maintained, then even SD person can maintain the same.
    3. Terms of payment appear in Company Code Data & sales Area Data. What is the impact of each? why is it not populated automatically, once it is maintained at either field?
    Terms of payment from company code data is for reporting purpose & it is from sales area data that it flows to sales order.
    It is a standard feature of SAP that it is not populated automatically if maintained in either of the fields, but it is a must in the sales area data and can be skipped in the company code data.
    4. Unable to select Sales Area for Customer Master Creation?
    Most Probably either sales area is not defined or customization not done for common Sales Org & Common Distribution Channel. To maintain this configuration: SPRO --> Sales & Distribution --> Master Data --> Define Common Distribution Channels / Define Common Divisions
    The above are few examples.
    Regards,
    Rajesh Banka
    Reward suitable points.
    How to give points: Mark your thread as a question while creating it. In the answers you get, you can assign the points by clicking on the stars to the left. You also get a point yourself for rewarding (one per thread).
