Log NC based on data collection

Is it possible to trigger the logging of an NC based on a data collection value being outside the acceptable range?
i.e. the acceptable range for the data collection is a number less than 6; if the user enters 7, I would like to log an NC that says the data collection is out of range.

To summarize:
What I'm taking away from this is that it is best practice to have only one parameter per DC group if you intend to trigger the automatic logging of an NC when that group "fails." The one parameter in the DC group MUST have a min/max value assigned, and a fail is triggered when the operator enters a value outside of that range. The NC is logged using the value assigned to the LOGNC_ID_ON_GROUP_FAILURE parameter in activity maintenance.
If there are multiple parameters in the DC group, they all have to have a min/max value assigned and ALL of the responses have to be out of range in order to fail the SFC.
I cannot have a DC group that contains parameters of multiple types and expect an NC to be logged based on an incorrect answer (for one question or multiple).
I cannot expect an NC to be logged based on an incorrect answer of one question, if the rest of the questions in the DC group are answered "correctly."
Sound correct?
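To be sure I have the rule straight, here it is written out as plain Java (an illustrative sketch of the logic only, not SAP ME's API; every class and field name here is hypothetical):

    import java.util.List;

    // Models the DC-group failure rule summarized above. Illustrative only.
    class DcParameter {
        double min, max, enteredValue;   // min/max MUST be assigned
        boolean outOfRange() { return enteredValue < min || enteredValue > max; }
    }

    class DcGroup {
        List<DcParameter> parameters;

        // The group "fails" only when ALL responses are out of range.
        boolean fails() {
            return parameters.stream().allMatch(DcParameter::outOfRange);
        }
    }

    class NcLogging {
        // logNcId is the value of LOGNC_ID_ON_GROUP_FAILURE from activity maintenance.
        static void onGroupComplete(DcGroup group, String logNcId) {
            if (group.fails()) {
                System.out.println("Logging NC " + logNcId);   // stand-in for the actual NC log
            }
        }
    }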
Edited by: Allison Davidson on Apr 18, 2011 10:06 AM  - typo

Similar Messages

  • ATP Based on Data Collection

    Hi,
    I am working on a fresh implementation project where I want to implement the scheduling features of OM (based upon collected data).
    I have already applied the patch related to R12.1.2 and also ran the ATP Partition program. Now all the table and partition names are in pairs. I have dropped the unwanted partitions as well and set up the profile options.
    Now, my question is: at what point do I need to run Data Collection? (After loading all the items / loading all the open transactions?)
    How often do I need to run Data Collection?
    Do I need to select all the organizations while setting up the instance?
    Appreciate your timely help.
    There are a few notes available on Metalink related to Data Collection and the related setup, but it would be really great if someone could help me set up and understand this feature.

    Now, my question is: at what point do I need to run Data Collection? (After loading all the items / loading all the open transactions?)
    Run the DC after loading all items, on-hand quantities, open supplies (POs, work orders), etc.
    How often do I need to run Data Collection?
    You should run a full DC every night and schedule the "Continuous Collections" program every 20 minutes or so.
    Do I need to select all the organizations while setting up the instance?
    No. Set up only those orgs which will impact ATP.
    Hope this answers your question
    Sandeep Gandhi
    Omkar Technologies Inc.
    Independent Techno-functional Consultant
    513-325-9026

  • Log files/troubleshooting performance data collection

    Hello: 
    I'm trying to use MAP 9.0.
    When doing performance data collection, I'm getting errors. Is there a log file or event log that captures why the errors are occurring?
    One posting said to look in bin\log, but there seems to be no log directory under bin in this version.
    Thank you, 
    Mustafa Hamid, System Center Consultant

    Hi Mark,
    There's no CLEANER_ADJUST_UTILIZATION in EnvironmentConfig for BDB JE 5.0.43, which I'm currently using. I also tried
       envConfig.setConfigParam("je.cleaner.adjustUtilization",
              "false");
    but it fails to start up with the error below:
    Caused by: java.lang.IllegalArgumentException: je.cleaner.adjustUtilization is not a valid BDBJE environment parameter
        at com.sleepycat.je.dbi.DbConfigManager.setConfigParam(DbConfigManager.java:412) ~[je-5.0.43.jar:5.0.43]
        at com.sleepycat.je.EnvironmentConfig.setConfigParam(EnvironmentConfig.java:3153) ~[je-5.0.43.jar:5.0.43]
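    For now, a defensive workaround is to probe the parameter before relying on it. A minimal sketch, assuming only that setConfigParam throws IllegalArgumentException for names the installed JE version does not know (which is exactly what the trace above shows):

        import com.sleepycat.je.EnvironmentConfig;

        public class JeConfigUtil {
            // Applies a JE config parameter only if this JE version accepts it;
            // unknown names raise IllegalArgumentException, which we swallow.
            public static void setIfSupported(EnvironmentConfig cfg,
                                              String name, String value) {
                try {
                    cfg.setConfigParam(name, value);
                } catch (IllegalArgumentException e) {
                    System.err.println("Skipping unsupported JE parameter: " + name);
                }
            }
        }

    That way the same configuration code can run against JE versions that do and do not support a given cleaner parameter.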

  • Datalogging with options to retrieve subset of log file based on date/time

    I would like to thank this forum for useful advice so far in completing my LabVIEW software.
    I have a data logging challenge. I am supposed to log about 30 parameters every 5 seconds. Some of these parameters are digital (ON/OFF), some are speed values (rpm), and others are percentages (%). It should be possible in future to do a histogram or bar chart plot of some of the parameters for a specific time range (say the last 5 minutes of a certain day); in effect, to extract a segment of the total log file.
    My challenge: if I use a text file, like the one in the attached VI, can it support retrieving data (while the VI is running) from the log file for a certain time range (i.e. retrieve a section of the log file based on a certain date/time range, on demand)?
    The format in the text file is close to what I require, since it lists the time in one column and the other parameters in other columns, to enable future histogram generation.
    Thanks a lot, friends.
    Solved!
    Go to Solution.
    Attachments:
    writer.vi ‏19 KB
    time.txt ‏1 KB

    Hey maxidivine,
    I've been playing around with your code and found that the search you require could be quite demanding on system resources when scaled to the size of your application. I shall try to find a way to perform the search using .txt files, but there are some other options available. I recommend the use of TDMS files, as the file format is a very efficient, manageable method of data logging. The TDMS file format is designed to write and read measured data at very high speed, while maintaining a hierarchical system of descriptive information.
    Traditionally, TDMS was a National Instruments-only file format – you could only read it using our products – LabVIEW/CVI/DIAdem. However, thanks to the popularity of the format, an add-in is now available for Excel, which allows you to open .tdms files directly in Excel (see link).
    National Instruments Technical Data Management Overview
    http://zone.ni.com/devzone/cda/tut/p/id/3676
    Introduction to LabVIEW TDM Streaming Vis
    http://zone.ni.com/devzone/cda/tut/p/id/3539
    VI-Based API for Writing TDMS Files
    http://zone.ni.com/devzone/cda/tut/p/id/6471
    TDM Excel Add-In Tool for Microsoft Excel User Guide
    http://zone.ni.com/devzone/cda/tut/p/id/4906
    TDM Excel Add-In for Microsoft Excel Download
    http://zone.ni.com/devzone/cda/epd/p/id/2944
    Troubleshooting the TDM Excel Add-In for Microsoft Excel 2000-2003
    http://zone.ni.com/devzone/cda/tut/p/id/5874
    Examples of the use of the TDMS API ship with LabVIEW. You will find them in Help > Find Examples > Fundamentals > File Input and Output. For your application, I would recommend the "Cont Acq&Graph Voltage - Write Data to File (TDMS).vi".
    Furthermore, if you require some help with DIAdem, I would recommend clicking "Getting Started" from the DIAdem splash screen. This opens a manual which discusses everything from data analysis to report generation. Also, if you have DIAdem 11 or above, there are tutorial videos which install with DIAdem. These are useful little tutorials which discuss all the DIAdem fundamentals. You can access them by selecting a particular palette tab (e.g. Report, View, Analysis) and then clicking the tutorial button (shown as a film strip with a question mark) at the top of the group view.
    Here are some more helpful DIAdem related resources for future reference.
    Report Gen in DIAdem...
    http://zone.ni.com/devzone/cda/tut/p/id/7379
    DataPlugins: Supported Data Formats (ni.com/dataplugins)
    http://zone.ni.com/devzone/cda/tut/p/id/4065
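    If you do stay with plain text logs in the meantime, the extraction itself is just a line-by-line timestamp comparison. Here is the idea as a rough sketch (shown in Java rather than LabVIEW, and assuming a hypothetical line format of an HH:mm:ss stamp followed by tab-separated values):

        import java.io.IOException;
        import java.nio.file.Files;
        import java.nio.file.Paths;
        import java.time.LocalTime;
        import java.util.List;
        import java.util.stream.Collectors;

        public class LogSlice {
            // Returns only the log lines whose leading timestamp falls in [from, to].
            public static List<String> slice(String path, LocalTime from, LocalTime to)
                    throws IOException {
                return Files.readAllLines(Paths.get(path)).stream()
                        .filter(line -> {
                            try {
                                LocalTime t = LocalTime.parse(line.split("\t", 2)[0]);
                                return !t.isBefore(from) && !t.isAfter(to);
                            } catch (Exception e) {
                                return false;   // skip headers and malformed lines
                            }
                        })
                        .collect(Collectors.toList());
            }
        }

    At 30 parameters every 5 seconds a linear scan like this stays cheap, but over months of data it becomes exactly the kind of work that TDMS indexing saves you.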
    Hope this is helpful
    Philip
    Philip
    Applications Engineer
    National Instruments
    UK Branch
    ===If this fixes your problem, mark as solution!===

  • Can I set alarms based on data collected in a FormsCentral calendar field?

    For example, if one of my fields is "CPR certification due date" and the data entered is 5/5/18, can I get an alarm on that day saying I am due for CPR certification?

    FormsCentral is a forms-publishing and response-collecting system. Once you have the data, you could import it into a database and then have the database generate actions based on particular reports you run, but there is no such functionality in FormsCentral.
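    As an illustration of that export-and-check approach, here is a minimal sketch (it assumes you export the responses to a hypothetical responses.csv with the due date in the second column, in M/d/yy format; this is not FormsCentral functionality):

        import java.io.IOException;
        import java.nio.file.Files;
        import java.nio.file.Paths;
        import java.time.LocalDate;
        import java.time.format.DateTimeFormatter;
        import java.time.format.DateTimeParseException;

        public class DueDateCheck {
            // Prints a reminder for every row whose due-date column is today.
            public static void main(String[] args) throws IOException {
                DateTimeFormatter fmt = DateTimeFormatter.ofPattern("M/d/yy");
                LocalDate today = LocalDate.now();
                for (String row : Files.readAllLines(Paths.get("responses.csv"))) {
                    String[] cols = row.split(",");
                    try {
                        if (cols.length > 1
                                && LocalDate.parse(cols[1].trim(), fmt).equals(today)) {
                            System.out.println("Due today: " + row);
                        }
                    } catch (DateTimeParseException e) {
                        // header or malformed row; skip it
                    }
                }
            }
        }

    Scheduled daily (cron, Task Scheduler), this gives the alarm behavior that the form itself cannot provide.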

  • How to create monitor based on the data collected by rule?

    I have an application which comes with a built-in command: GetBackLogNm.exe. Running this command can get the current backlog number in the system.
    Now I need to create a monitor which should trigger an alert if the backlog keeps increasing for 2 hours. I am thinking of creating a rule which runs the command GetBackLogNm.exe to get the number and saves it into the DW, then creating a monitor based on the collected
    data. Is this the correct way? If so, can anyone give me an outline of how to do it? An example would be much appreciated if possible.
    Thanks!

    Hi Jonathan, thanks for the quick reply. But System.ConsolidatorCondition seems to be available only in SCOM 2012; I didn't find it in the 2007 libraries.
    However, the idea you and Vladimir showed in this
    post really helped. I more or less solved the issue without using exactly those components (since they are available in 2012 only), but following the same logic.
    In general I did this:
    1. Create a script rule which executes the command to get the backlog number, then saves the number in the DW and in the registry.
    2. Just like System.Performance.DeltaValueCondition, the second time the rule is triggered by the schedule it compares the current backlog number with the last known number stored in the registry. If the value exceeds the given threshold AND is bigger than
    the last known backlog number, the rule increments a counter which is also stored in the registry (see the sketch after step 3).
    3. Create a unit monitor that watches the counter stored in the registry and triggers the alert if the counter exceeds a given value.
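    The core of step 2 is just compare-and-count state kept between runs. As plain logic (a Java sketch; a file stands in for the registry, and all names are illustrative, not SCOM APIs):

        import java.io.IOException;
        import java.nio.file.Files;
        import java.nio.file.Path;

        public class BacklogDelta {
            // Persists the last backlog value and a growth counter between runs.
            public static int update(long current, long threshold, Path state)
                    throws IOException {
                long last = 0;
                int counter = 0;
                if (Files.exists(state)) {
                    String[] parts = Files.readString(state).trim().split(",");
                    last = Long.parseLong(parts[0]);
                    counter = Integer.parseInt(parts[1]);
                }
                // Count only consecutive runs where the backlog keeps growing
                // above the threshold; otherwise start over.
                counter = (current > threshold && current > last) ? counter + 1 : 0;
                Files.writeString(state, current + "," + counter);
                return counter;   // the unit monitor alerts when this exceeds a limit
            }
        }

    With the rule scheduled every few minutes, "backlog keeps increasing for 2 hours" becomes simply "counter exceeds 2 hours' worth of runs".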
    This solved the issue, but I feel a little uncomfortable storing data in the registry.
    So one quick question, which might be outside this topic: what is a good way to store the data generated by one workflow and then share it with another workflow, or reuse it some time later? In my case I used a registry key. Cook-down is one option
    for sharing data, but it is very strict and can't solve the issue I have in this case.
    Thanks again!

  • Sluggish data collection will log but not plot

    Please be gentle with another newbie here. Unfortunately, I am stuck using LV6 on a Windows XP machine, so I am limited in the options I have to control data logging and event structures. However, I have done the best I can for the application with what I have learned of LabVIEW. I am trying to set up a multichannel, continuous (long-term) data collection from the serial port which will send the data to a chart and a log. I have tried to build my own event structure to tell it to collect faster if there is a change in the data value and to collect more slowly when the change in the data is minimal (based on the mean values).
    Any ideas on why this is running so sluggishly and not charting?
    Thanks for all input and help!!
    Attachments:
    4 Channel Temp Monitor_latest.vi ‏1170 KB

    Some things I see.
    1.  You are setting a lot of properties for the charts on every iteration, along with property nodes for some other controls, particularly the ones involving scaling. These force the UI to update often. I would recommend only writing to property nodes when something changes. If you can use an event structure, great. If not, just compare the old value to the new value and only write out values if they differ.
    2.  I can't tell if anything controls the speed of your main while loop. You might want to put in a small wait statement.
    3.  Don't open and close the serial port on every iteration. You are actually doing it several times within an iteration. Open it once, read and write to it in a loop, and close the port when the program ends after the loop.
    4.  Some of the stacked sequence structures seem suspect. Some are using dequeues from the same queue in every frame, only to OR all the data together at the end. It seems like a For Loop would be a better choice.
    5.  Do all your graphs need to be single representation? Make them double. You can also avoid the coercion dot (double-to-single conversion) in your Scan from String functions if you wire a single-representation constant into the type terminal of the Scan from String function.
    I'm sure there are more things that could be fixed, but I really suspect #1 and #2 as the main problems as to why your code seems sluggish.

  • Implement log based change data capture

    Hi,
    I am trying to get log-based change data capture to work. My ODI version is 11.1.1.5. I guess for log-based CDC there are 2 ways:
    1) use Streams
    2) use the LogMiner tool
    My database is Oracle 11g Express Edition. Streams, I know, is possible only in the Enterprise Edition of Oracle. So can anyone tell me how to implement log-based CDC, since the LogMiner tool is not preferred in 11g?

    Hi,
    Thanks for your reply...
    I received an error while creating the change table:
    ORA-29540: class oracle/CDC/PublishApi does not exist
    ORA-06512: at "SYS.DBMS_CDC_PUBLISH", line 298
    Can you please help me fix this?
    by,
    Nagaa

  • Data collection was switched from an AI Config task writing to an hsdl file to synchronized DAQmx tasks logging to TDMS files. Why are different readings produced for the same test?

    A software application was developed to collect and process readings from capacitance sensors and a tachometer in a running spin rig. The sensors were connected to an Aerogate Model HP-04 H1 Band Preamp connected to an NI PXI-6115. The sensors were read using AI Config and AI Start VIs. The data was saved to a file using hsdlConfig and hsdlFileWriter VIs. In order to add the capability of collecting synchronized data from two Eddy Current Position sensors (which will be connected to a BNC-2144 connected to an NI PXI-4495) in addition to the existing sensors, the AI and HSDL VIs were replaced with DAQmx VIs logging to TDMS. When running identical tests, the new file format (TDMS) produces readings that are higher than, and inconsistent with, the readings from the older file format (HSDL).
    The main VIs are SpinLab 2.4 and SpinLab 3.8, in folders "SpinLab old format" and "Spinlab 3.8" respectively. SpinLab 3.8 requires the Sound and Vibration suite to run correctly, but that is used after the part that is causing the problem. The problem is occurring during data collection in the Logger segment of code, or during processing in the Reader/Converter segment of code. I could send the readings from the identical tests if they would be helpful, but the data takes up approximately 500 MB.
    Attachments:
    SpinLab 3.8.zip ‏1509 KB
    SpinLab 2.4.zip ‏3753 KB
    SpinLab Screenshots.doc ‏795 KB

    First of all, how different is the data?  You say that the reads are higher and inconsistent.  How much higher?  Is every point inconsistent, or is it just parts of your file?  If it's just in parts of the file, does there seem to be a consistent pattern as to when the data is different?
    Secondly, here are a couple things to try:
    Currently, you are not calling DAQmx Stop Task outside of the loop; you're just calling DAQmx Clear Task. This means that if any errors occurred in the logging thread, you might not be getting them (as DAQmx Clear Task clears outstanding errors within the task). Add a DAQmx Stop Task before DAQmx Clear Task to make sure that you're not missing an error.
    Try "Log and Read" mode.  "Log and Read" is probably going to be fast enough for your application (as it's pretty fast), so you might just try it and see if you get any different result.  All that you would need to do is change the enum to "Log and Read", then add a DAQmx Read in the loop (you can just use Raw format since you don't care about the output).  I'd recommend that you read in even multiples of the sector size (normally 512) for optimal performance.  For example, your rate is 1MHz, perhaps read in sizes of 122880 samples per channel (something like 1/8 of the buffer size rounded down to the nearest multiple of 4096).  Note: This is a troubleshooting step to try and narrow down the problem.
    Finally, how confident are you in the results from the previous HSDL test?  Which readings make more sense?  I look forward to hearing more detail about how the data is inconsistent (all data, how different, any patterns).  As well, I'll be looking forward to hearing the result of test #2 above.
    Thanks,
    Andy McRorie
    NI R&D

  • How to get the most current file based on date and time stamp using SSIS?

    Hello,
    Let us assume that files get copied in a specific directory. We need to pick up a file and load data. Can you guys let me know how to get the most current file based on date and time stamp using SSIS?
    Thanks
    thx regards dinesh vv

    Hi Simon,
    I executed this script and it is giving an error:
    /*
       Microsoft SQL Server Integration Services Script Task
       Write scripts using Microsoft Visual C# 2008.
       The ScriptMain is the entry point class of the script.
    */
    using System;
    using System.Data;
    using Microsoft.SqlServer.Dts.Runtime;
    using System.Windows.Forms;

    namespace ST_9a6d985a04b249c2addd766b58fee890.csproj
    {
        [System.AddIn.AddIn("ScriptMain", Version = "1.0", Publisher = "", Description = "")]
        public partial class ScriptMain : Microsoft.SqlServer.Dts.Tasks.ScriptTask.VSTARTScriptObjectModelBase
        {
            #region VSTA generated code
            enum ScriptResults
            {
                Success = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Success,
                Failure = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Failure
            };
            #endregion

            /*
              The execution engine calls this method when the task executes.
              To access the object model, use the Dts property. Connections, variables, events,
              and logging features are available as members of the Dts property, as shown in the following examples.
              To reference a variable, call Dts.Variables["MyCaseSensitiveVariableName"].Value;
              To post a log entry, call Dts.Log("This is my log text", 999, null);
              To fire an event, call Dts.Events.FireInformation(99, "test", "hit the help message", "", 0, true);
              To use the connections collection, use something like the following:
              ConnectionManager cm = Dts.Connections.Add("OLEDB");
              cm.ConnectionString = "Data Source=localhost;Initial Catalog=AdventureWorks;Provider=SQLNCLI10;Integrated Security=SSPI;Auto Translate=False;";
              Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.
              To open Help, press F1.
            */
            public void Main()
            {
                // Scan the folder named by User::FolderName and remember the
                // file with the most recent creation time.
                string[] files = System.IO.Directory.GetFiles(
                    Dts.Variables["User::FolderName"].Value.ToString());

                DateTime currentDate = new DateTime();   // i.e. DateTime.MinValue
                string lastFile = string.Empty;

                foreach (string f in files)
                {
                    System.IO.FileInfo finf = new System.IO.FileInfo(f);
                    if (finf.CreationTime >= currentDate)
                    {
                        currentDate = finf.CreationTime;
                        lastFile = f;
                    }
                }

                // Hand the winner back to the package and report success.
                Dts.Variables["User::LastFile"].Value = lastFile;
                Dts.TaskResult = (int)ScriptResults.Success;
            }
        }
    }
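    One note on the logic: FileInfo.CreationTime reflects when the file landed in the folder (a copy gets a fresh creation time), which suits "most recently arrived". If you instead want "most recently modified", compare finf.LastWriteTime in the same loop.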
    thx regards dinesh vv

  • Too many BPM data collection jobs on backend system

    Hi all,
    We find about 40,000 data collection jobs running on our ECC6 system, far too many.
    We run about 12 solutions, all linked to the same backend ECC6 system. Most probably this is part of the problem. We plan to scale down to 1 solution rather than the country-based approach.
    But here we are now, and I have these questions.
    1. How can I relate a BPM_DATA_COLLECTION job on ECC6 back to a particular solution? The job log gives me a monitor ID, but I can't relate that back to a solution.
    2. If I deactivate a solution in the solution overview, does that immediately cancel the data collection for that solution?
    3. In the monitoring schedule on a business process step we sometimes have intervals defined as 5 minutes, sometimes 60. The strange thing is that the drop-down for that field does not always give us the same list of values. Even within a solution I see that in one step I have the choice of a long list of intervals, while in the next step in the same business process I can only choose between blank and 5 minutes.
    How is this defined?
    Thanks in advance,
    Rad.

    Hi,
    How did you manage to get rid of this issue? I am facing the same.
    Thanks,
    Manan

  • Utility data collection job Failure on SQL server 2008

    Hi,
    I am facing a data collection job failure issue (Utility - Data Collection) on a SQL Server 2008 server. Below is the error message:
    <service Name>. The step did not generate any output.  Process Exit Code 5.  The step failed.
    The job name is collection_set_5_noncached_collect_and_upload. From what I found on Google, it is related to a permission issue, but where exactly are the access issues coming from? This job is running under a proxy account. Thanks in advance.

    Hi Srinivas,
    Based on your description, you encounter the error message after configuring data collection in SQL Server 2008. For further analysis, could you please help to collect detailed log information? You can check the job history to find the error log around the
    issue, as mentioned in this
    article. Also, please check the Data Collector logs by right-clicking Data Collection in the Management folder and selecting View Logs.
    In addition, as your post indicates, exit code 5 is normally an 'Access is denied' code. Thus please make sure that the proxy account has admin permissions on your system, and ensure that the SQL Server service account has rights to access the cache folder.
    Thanks,
    Lydia Zhang

  • Data Collection job collection_set_2_upload failed

    We had a standalone server running SQL Server 2008 R2. I had configured the Data Collection jobs and uploaded the data into our Management Data Warehouse. It had worked fine. One week ago, the standalone server was rebuilt, and
    then I found that all 4 jobs didn't work:
    collection_set_2_collection
    collection_set_2_upload
    collection_set_3_collection
    collection_set_3_upload
    I re-configured both the Management Data Warehouse and Data Collection. Now the two collection jobs, collection_set_2_collection and collection_set_3_collection, work fine. However, the two upload jobs, collection_set_2_upload and collection_set_3_upload, fail with the error "the
    step did not generate any output". I cleaned up all DataCollectorCache files on the standalone server, but the error stayed.
    Any idea?  Thank you for any kind of suggestion.
    Charlie He

    Hi Charlie,
    Based on my understanding, you configured the Data Collection jobs, then the two upload jobs didn't work, giving the error "the step did not generate any output", and the error persisted after cleaning up the Data Collector cache files.
    New data cannot be uploaded to the Management Data Warehouse database when one or more Data Collector cache files are corrupted. The corruption may be caused by one of the following reasons:
    Data Collector encountered an exception.
    The disk runs out of free space while Data Collector is writing to a cache file.
    A firmware or a driver problem occurs.
    So I suggest you double-check that the Data Collector cache files have been cleaned up completely. For more information, please refer to this KB:
    http://support.microsoft.com/kb/2019126.
    If the issue persists, please check the agent job history to find the error log from around the time the issue occurred. It would be better if you could provide detailed error information for our further analysis. For how to check agent job history, please refer
    to this link:
    http://msdn.microsoft.com/en-us/library/ms181046(v=sql.110).aspx.
    Best regards,
    Qiuyun Yu

  • ASCP Data Collection Fails

    Dear Members:
    For the sake of learning ATP in an R12.1.1 Vision instance, I created a simple ATP rule with Demand as Sales Order and Supply as PO and On-hand Available.
    I followed all the steps below:
    1) After creating the ATP rule and attaching it on the Order Management tab in the item master, enabled ATP at the org level.
    2) Created a sourcing rule for that item and assigned it to an assignment set (Supplier Scheduling).
    3) This assignment set appears by default in the MRP: ATP Assignment Set profile option.
    4) Set INV: Capable to Promise to "ATP/CTP based on collected data" at the responsibility level.
    5) Included the organizations in instance TST (default).
    6) Ran the ATP Data Collection program.
    RCS errored out with a user-defined exception (Source Setup Objects Creation Requests did not complete successfully), and the Collection Triggers/Views requests errored out with
    "Error with creating Source Triggers/Views".
    Is this a setup problem, or is there a patch I need to apply? Could you please guide me?
    Thanks in advance.
    Atanu

    Following are in log for them:
    **Starts**03-JAN-2011 20:00:52
    **Ends**03-JAN-2011 20:12:22
    Error in creating Source Triggers
    Start of log messages from FND_FILE
    03-JAN 20:00:52 : Request : 5807365 :Creates Item Triggers used by Collections Process
    03-JAN 20:00:52 : Request : 5807366 :Creates BOM Triggers used by Collections Process
    03-JAN 20:00:52 : Request : 5807367 :Creates Routing Triggers used by Collections Process
    03-JAN 20:00:52 : Request : 5807368 :Creates WIP Triggers used by Collections Process
    03-JAN 20:00:52 : Request : 5807369 :Creates Demand Triggers used by Collections Process
    03-JAN 20:00:52 : Request : 5807370 :Creates Supply Triggers used by Collections Process
    03-JAN 20:00:52 : Request : 5807371 :Creates Other Triggers used by Collections Process
    03-JAN 20:00:52 : Request : 5807372 :Creates Repair Order Triggers used by Collections Process
    End of log messages from FND_FILE
    *********************************** And that of the Source Views log: **************************************************************
    **Starts**03-JAN-2011 20:00:47
    **Ends**03-JAN-2011 20:12:18
    Error in creating Source Views
    Start of log messages from FND_FILE
    03-JAN 20:00:47 : Request : 5807356 :Creates Setup Views used by Collections Process
    03-JAN 20:00:47 : Request : 5807357 :Creates Item Views used by Collections Process
    03-JAN 20:00:47 : Request : 5807358 :Creates BOM Views used by Collections Process
    03-JAN 20:00:47 : Request : 5807359 :Creates Routing Views used by Collections Process
    03-JAN 20:00:47 : Request : 5807360 :Creates WIP Views used by Collections Process
    03-JAN 20:00:47 : Request : 5807361 :Creates Demand Views used by Collections Process
    03-JAN 20:00:48 : Request : 5807362 :Creates Supply Views used by Collections Process
    03-JAN 20:00:48 : Request : 5807363 :Creates Other Views used by Collections Process
    03-JAN 20:00:48 : Request : 5807364 :Creates Repair Order Views used by Collections Process
    End of log messages from FND_FILE
    Executing request completion options...
    Output is not being printed because:
    The print option has been disabled for this report.
    Finished executing request completion options.
    Concurrent request completed
    Current system time is 03-JAN-2011 20:12:18
    ======================================================================
    Thanks
    Atanu

  • Data Collection -- Planning Data Pull process failed

    Hello Experts,
    I am having a problem with data collection: the result of the Planning Data Pull was "completed with error". It seems the problem is in the Planning Data Pull Worker.
    I am using 2 workers to run the Planning Data Pull, and both results were errors. Below is some of the error log:
    16-MAY 19:11:34 : Procedure MSC_CL_SETUP_PULL.LOAD_CALENDAR_DATE started.
    Into populate_rsrc_cal
    16-MAY-2011 19:11:40
    APS string is Invalid, check for Error condition
    16-MAY 19:11:40 : Error in Routine GMP_CALENDAR_PKG.POPULATE_RSRC_CAL.
    16-MAY 19:11:40 : User-Defined Exception
    16-MAY 19:11:40 : User-Defined Exception
    16-MAY 19:11:40 : Error_Stack...
    16-MAY 19:11:40 : ORA-06510: PL/SQL: unhandled user-defined exception
    16-MAY 19:11:40 : Error_Backtrace...
    16-MAY 19:11:40 : ORA-06512: at "APPS.MSC_CL_PULL", line 6218
    ORA-06512: at "APPS.MSC_CL_PULL", line 1583
    Help me please.
    Thanks & Regards,
    Andi

    This error shows up when the process cannot find any organizations.
    Use the SQL statements below to identify this:
    1. select MSC_CL_PULL.get_org_str(&instance_id) from dual;
    2. select instance_code, instance_id from msc.msc_apps_instances where instance_code = '&instance_code';
    3. select * from apps.msc_instance_orgs where sr_instance_id = &instance_id;
    SQL 1 should in principle return the available orgs; if it returns null or -9999, then it's an orgs problem.
    SQL 3 will show the available orgs.
    If the organizations are enabled for planning in the ASCP Instances form, then run a targeted refresh for trading partners (suppliers/customers/orgs). If this works with no errors, then look at the table msc_trading_partners. It should contain records for the orgs with partner_type = 3.

Maybe you are looking for

  • No more "current and next month" list on left of iCal

    Since the upgrade to Lion, the current and next month list on the left is gone. When I want to reschedule an event, I used to look on the left to see the dates in the next month, or the current month if in week view, etc. ... now it's not there. If I want

  • How can you add a hyperlink to a picture on iBooks Author?

    I want to link social media icons to their websites from the iBook I am making in iBooks Author, but I can't do it. I can add a hyperlink to text only. Does anyone know how to do this?

  • Connect Cintiq 12wx to my MacBook Pro 17

    What kind of cable do I need to make this connection?  MacBook is only one year old.

  • Edit in Photoshop - Image Reset?

    It appears that when an image is opened in PS via the "Edit in" command, all develop settings are removed and the image is back to as-shot colors, contrast, etc. How can I avoid this and edit/print the image as it appears in LR?

  • Not able run the report in PDF format

    Hi, I'm using Report Builder 9.0.4.0.33 (10g). I'm able to generate a salary slip for 1 employee, 2 or 3, in PDF format, but when there are more than 1000, in PDF format only, it gives the following error: REP-1922: Internal error REP-50125: Caught exception: java.io.