Python, numpy, values of dicts, and numpy.where

If I have a numpy.array whose values are dicts, say { 'this' : 'that' }, how can I use numpy.where() to return the array coordinates of the values that are a dict? Or even better, of dicts with a key of 'this' in them? I'd much prefer to do this without a loop (as that would be very slow). Thanks!

One way to do this without a loop:
import numpy

def myfunc(x):
    # True only for dicts that contain the key 'this'
    return isinstance(x, dict) and 'this' in x

# Turn the predicate into a ufunc that applies element-wise over object arrays
myfuncv = numpy.frompyfunc(myfunc, 1, 1)
indices = numpy.where(myfuncv(some_array))
However, you're not likely to see a significant speed improvement over calling myfunc in a loop, since either way you're still calling a Python function for every element of the array. Unfortunately, I don't see any way around this: determining an object's type and testing a dict for a key are fundamentally Python operations. If you disagree, perhaps you should clarify your problem a bit and give some more context.
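
For concreteness, here is a tiny self-contained demo of that approach. The array contents below are made up for illustration, and since frompyfunc returns an object-dtype result, casting to bool before where() is the cautious move:

import numpy

def myfunc(x):
    return isinstance(x, dict) and 'this' in x

myfuncv = numpy.frompyfunc(myfunc, 1, 1)

# An object-dtype array mixing dicts and other values (hypothetical data)
some_array = numpy.array([{'this': 'that'}, 42, {'other': 1}, {'this': 2}], dtype=object)

mask = myfuncv(some_array).astype(bool)
print(numpy.where(mask))   # -> (array([0, 3]),)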

Similar Messages

  • Where we use Conversions? like Fixed Value and Value mapping? and use of th

    Where do we use Conversions like Fixed Value and Value mapping?
    And what is the use of that?

    Hi
    Have a Look at this
    /people/community.user/blog/2007/01/08/valuemapping-using-the-graphical-mapping-tool
    Value mapping is used to map different representations of an object to each other.
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/9202d890-0201-0010-1588-adb5e89a6638
    You can do value mapping both in IR and ID
    IR: Message Mapping
    http://help.sap.com/saphelp_nw04/helpdata/en/43/c4cdfc334824478090739c04c4a249/content.htm
    /people/sreekanth.babu2/blog/2005/01/05/design-time-value-mappings-in-xslt
    ID: Tools --> Value Mapping
    http://help.sap.com/saphelp_nw04/helpdata/en/13/ba20dd7beb14438bc7b04b5b6ca300/content.htm
    Value mapping replication: For uploading data from tables, excel sheet etc. into XI
    http://help.sap.com/saphelp_nw04/helpdata/en/2a/9d2891cc976549a9ad9f81e9b8db25/content.htm
    Value mapping
    /people/siva.maranani/blog/2005/08/23/lookup146s-in-xi-made-simpler
    /people/sravya.talanki2/blog/2005/12/21/use-this-crazy-piece-for-any-rfc-mapping-lookups
    /people/sap.user72/blog/2005/12/06/optimizing-lookups-in-xi
    /people/alessandro.guarneri/blog/2006/03/27/sap-xi-lookup-api-the-killer
    Thanks

  • Where I can find the value category code and element in CJVC

    hi gurus
    Where can I find the value category code and element in CJVC? In which table are these fields stored? Could anyone tell me the relation?
    Thanks & Regards
    Sandya

    Not answered.

  • Python segmentation fault with matplotlib and opencv2

    Once again fellow archers, I am in need of your help.
    I have a file, Test1.py, which uses cv2, pylab, and numpy.
    Excluding either the pylab or the cv2 functions, I can make the rest of the script work, but when pylab is imported, calling cv2's imshow() function causes a "segmentation fault (core dumped)".
    By the sound of it, I thought it might come from the underlying C stuff rather than Python.
    So I ran it through gdb.
    This is where I need help. I'm not sure how to interpret the backtrace, and I probably wouldn't know what to do if I did.
    Perhaps a problem with glib or GObject?
    Below are the Python script in question (it's a template for a school assignment) and the gdb trace.
    Test1.py
    import sys
    sys.settrace
    import numpy as np
    import cv2
    import math
    import pylab

    #----------------------------Functions
    def show1_OpenCV(image):
        ## This function defines a window by namedWindow() and then shows the image in that window
        image = np.array(image)
        # cv2.namedWindow('ByOpenCV', cv2.WINDOW_AUTOSIZE)  ## create a window called 'By OpenCV'.
        # Using the cv2.WINDOW_AUTOSIZE parameter when defining a window displays the image with its actual size in the window.
        cv2.imshow('ByOpenCV', image)  ## show the image in the 'By OpenCV' window
        # cv2.waitKey(0)  ## the window will be closed with a (any) key press

    def show2_OpenCV(*image):
        ## Showing the images using OpenCV
        im = []
        for i in image:
            im.append(array(i))
        cv2.namedWindow('1', cv2.WINDOW_AUTOSIZE)  ## create a window; cv2.WINDOW_AUTOSIZE displays the image with its actual size in the window
        cv2.imshow('1', im[0])
        cv2.namedWindow('2', cv2.WINDOW_AUTOSIZE)
        cv2.imshow('2', im[1])
        cv2.waitKey(0)  ## the windows will be closed with a (any) key press

    def showAll_OpenCV(**image):
        im = []
        for (k, v) in image.items():
            cv2.namedWindow(str(k), cv2.WINDOW_AUTOSIZE)
            cv2.imshow(k, np.array(v))
        cv2.waitKey()

    def show1_pylab(image):
        ## Showing the image using pylab
        figure("By pylab")  ## Create a figure
        gray()
        title("1"); imshow(image)
        show()

    def show2_pylab(*images):
        ## Showing the images using pylab
        figure("By pylab")  ## Create a figure
        gray()
        subplot(1,2,1); title("1"); imshow(images[0])  ## more about 'subplot()': <http://www.scipy.org/Cookbook/Matplotlib/Multiple_Subplots_with_One_Axis_Label>
        subplot(1,2,2); title("2"); imshow(images[1])
        show()

    def grayLevelMap2(I, vector):
        I = np.copy(I)
        (x, y) = np.shape(I)
        for i in range(x):
            for j in range(y):
                I[i, j] = vector[I[i, j]]
        return I

    def grayLevelMap(I, a, b):
        I = np.copy(I)
        (x, y) = np.shape(I)
        for i in range(x):
            for j in range(y):
                I[i, j] = max(0, min(255, a * I[i, j] + b))
        return I

    def grayLevelMapF(I, f):
        I = np.copy(I)
        (x, y) = np.shape(I)
        for i in range(x):
            I[i] = map(f, I[i])
        return I

    def f(x):
        func = (255 - x) * math.sqrt(100 / +1)
        return func

    def displayVect(v):
        points = []
        for i in range(len(v)):
            points.append((i, v[i]))
        bins = range(256)
        n = v
        grid(None, 'major', 'both')
        plot(bins, n, 'k-', linewidth=5)
        axis([-2, 256, -2, 256])
        show()

    ##------------------------------Main body
    # Loading images using OpenCV
    I1 = cv2.imread("children.tif")
    I2 = cv2.imread("Eye1.jpg")
    I3 = cv2.imread("Marker1.jpg")
    I4 = cv2.imread("GreenTest.jpg")
    I1 = cv2.cvtColor(I1, cv2.COLOR_RGB2GRAY)
    I2 = cv2.cvtColor(I2, cv2.COLOR_RGB2GRAY)
    I3 = cv2.cvtColor(I3, cv2.COLOR_RGB2GRAY)
    I4 = cv2.cvtColor(I4, cv2.COLOR_RGB2GRAY)
    show1_OpenCV(grayLevelMapF(I4, lambda x: 2 * x))
    show1_OpenCV(I4)
    gdb output:
    Reading symbols from /usr/bin/python2...(no debugging symbols found)...done.
    (gdb) run Test1.py
    Starting program: /usr/bin/python2 Test1.py
    warning: Could not load shared library symbols for linux-vdso.so.1.
    Do you need "set solib-search-path" or "set sysroot"?
    [Thread debugging using libthread_db enabled]
    Using host libthread_db library "/usr/lib/libthread_db.so.1".
    Traceback (most recent call last):
    File "/usr/share/gdb/auto-load/usr/lib/libgobject-2.0.so.0.3400.3-gdb.py", line 9, in <module>
    from gobject import register
    File "/usr/share/glib-2.0/gdb/gobject.py", line 3, in <module>
    import gdb.backtrace
    ImportError: No module named backtrace
    [New Thread 0x7fffdf259700 (LWP 3981)]
    Program received signal SIGSEGV, Segmentation fault.
    0x00000036cb6bc785 in do_warn () from /usr/lib/libpython2.7.so.1.0
    (gdb) backtrace
    #0 0x00000036cb6bc785 in do_warn () from /usr/lib/libpython2.7.so.1.0
    #1 0x00000036cb6bcc4c in PyErr_WarnEx () from /usr/lib/libpython2.7.so.1.0
    #2 0x00007fffe10153b5 in ?? ()
    from /usr/lib/python2.7/site-packages/gobject/_gobject.so
    #3 0x00000036cde4ea00 in g_logv () from /usr/lib/libglib-2.0.so.0
    #4 0x00000036cde4ebf2 in g_log () from /usr/lib/libglib-2.0.so.0
    #5 0x00007fffef8e009e in gtk_disable_setlocale ()
    from /usr/lib/libgtk-x11-2.0.so.0
    #6 0x00007ffff21ca8a8 in cvInitSystem ()
    from /usr/lib/libopencv_highgui.so.2.4
    #7 0x00007ffff21cafb3 in cvNamedWindow ()
    from /usr/lib/libopencv_highgui.so.2.4
    #8 0x00007ffff352dfe4 in ?? () from /usr/lib/python2.7/site-packages/cv2.so
    #9 0x00000036cb6dc05a in PyEval_EvalFrameEx ()
    from /usr/lib/libpython2.7.so.1.0
    #10 0x00000036cb6dba83 in PyEval_EvalFrameEx ()
    from /usr/lib/libpython2.7.so.1.0
    #11 0x00000036cb6dcedd in PyEval_EvalCodeEx ()
    ---Type <return> to continue, or q <return> to quit---j
    from /usr/lib/libpython2.7.so.1.0
    #12 0x00000036cb6dcfb2 in PyEval_EvalCode () from /usr/lib/libpython2.7.so.1.0
    #13 0x00000036cb6f5eea in run_mod () from /usr/lib/libpython2.7.so.1.0
    #14 0x00000036cb6f6ce2 in PyRun_FileExFlags ()
    from /usr/lib/libpython2.7.so.1.0
    #15 0x00000036cb6f76fb in PyRun_SimpleFileExFlags ()
    from /usr/lib/libpython2.7.so.1.0
    #16 0x00000036cb7089f2 in Py_Main () from /usr/lib/libpython2.7.so.1.0
    #17 0x00007ffff7833a15 in __libc_start_main () from /usr/lib/libc.so.6
    #18 0x0000000000400741 in _start ()
    (gdb) run Test1.py

    Thank you so much!
    This (ugly) workaround might have saved me from a lot of trouble with this semester's exercises.
    Just to be clear, the solution is:
    Import opencv
    Call namedWindow
    Do the rest of the imports
    Do the rest of the script
    Correct?
    I'm hesitant to mark the thread as solved, as the problem is still very much there; this workaround just allows us to ignore it.
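
    For anyone landing here later, a minimal sketch of that workaround order in Python. It is untested, the window name 'init' is made up, and it only mirrors what this thread describes: let OpenCV's highgui initialize the GUI toolkit before pylab/matplotlib is imported.

    import cv2
    cv2.namedWindow('init', cv2.WINDOW_AUTOSIZE)  # forces highgui (and its GTK setup) to run first
    cv2.destroyWindow('init')                     # the placeholder window itself is not needed

    # Only now pull in the rest of the stack
    import numpy as np
    import pylab
    # ... rest of the script unchanged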

  • How to know the dynamic values for this :AND category_id_query IN (1, :3, )

    Hi Team,
    R12 Instance :
    Oracle Installed Base Agent User Responsibility --> Item Instances -->
    Item Instance: Item Instances > View : Item Instance : xxxxx> Contracts : Item Instance : xxxxx> Service Contract: xxxxx>
    In the above page there are two table regions.
    Notes.
    -------------------------------------Table Region---------------------------
    Attachments
    -------------------------------------Table Region---------------------------
    --the attachments are shown using the query from the fnd_lobs and fnd_docs etc...
    I want to know which document types are displayed on this page.
    --We developed a custom program to attach the attachments to the service contracts, and the above seeded OAF page displays those as needed.
    But after recent changes, the Attachments table region is not showing the attachments.
    I have verified the query and could not find any clue in it, but I need some help if you guys can provide it.
    SELECT *
    FROM
      (SELECT d.DOCUMENT_ID,
              d.DATATYPE_ID,
              d.DATATYPE_NAME,
              d.DESCRIPTION,
              DECODE(d.FILE_NAME, NULL,
                (SELECT message_text
                 FROM fnd_new_messages
                 WHERE message_name   = 'FND_UNDEFINED'
                 AND   application_id = 0
                 AND   language_code  = userenv('LANG')
                ), d.FILE_NAME) FileName,
              d.MEDIA_ID,
              d.CATEGORY_ID,
              d.DM_NODE,
              d.DM_FOLDER_PATH,
              d.DM_TYPE,
              d.DM_DOCUMENT_ID,
              d.DM_VERSION_NUMBER,
              ad.ATTACHED_DOCUMENT_ID,
              ad.ENTITY_NAME,
              ad.PK1_VALUE,
              ad.PK2_VALUE,
              ad.PK3_VALUE,
              ad.PK4_VALUE,
              ad.PK5_VALUE,
              d.usage_type,
              d.security_type,
              d.security_id,
              ad.category_id attachment_catgeory_id,
              ad.status,
              d.storage_type,
              d.image_type,
              d.START_DATE_ACTIVE,
              d.END_DATE_ACTIVE,
              d.REQUEST_ID,
              d.PROGRAM_APPLICATION_ID,
              d.PROGRAM_ID,
              d.category_description,
              d.publish_flag,
              DECODE(ad.category_id, NULL, d.category_id, ad.category_id) category_id_query,
              d.URL,
              d.TITLE
       FROM FND_DOCUMENTS_VL d,
            FND_ATTACHED_DOCUMENTS ad
       WHERE d.DOCUMENT_ID = ad.DOCUMENT_ID
      ) QRSLT
    WHERE ((entity_name           = 'OKC_K_HEADERS_V'       -- :1
            AND pk1_value        IN (600144, 599046)        -- :2
            AND category_id_query IN (1, :3, :4, :5, :6, :7))
           AND datatype_id       IN (6, 2, 1, 5)
           AND (SECURITY_TYPE     = 4
                OR PUBLISH_FLAG   = 'Y'))
    --='000180931' -- 'ADP118'
    The above seeded query is the one the table region uses to retrieve the data.
    How can I find out the dynamic values for this: AND category_id_query IN (1, :3, :4, :5, :6, :7)?
    --Sridhar

    Hi Patricia,
    does it work for restricted key figures and calculated key figures?

    Note Number   Fisc Period   Opening Days
    1             1             2
    2             1             3
    3             1             0

    Because I have other restrictions, I created two restricted key figures:
    RK1 with restriction: Total Number of Notes,
    RK2 with restriction: Total Opening Days,
    then I created a calculated key figure for the average opening days in a period:
    CK1 = RK2 / RK1.
    In this case, I am not sure if it will work or not.
    For example, during the RK2 calculation it might be 2 + 3 = 5, i.e. the line with 0 is ignored;
    during the RK1 calculation it might be 1 + 1 + 1 = 3. I am not sure whether, in this case, the line with 0 opening days will be counted or not.
    Could you please confirm?

  • Need a report with PO, GR, IR values in LC and FC

    Dear experts,
    How can I develop this custom report with PO, GR, and IR values in LC and FC, with the following fields?
    Is there any standard SAP report? Please advise.
    PO NO
    PO DATE
    GRNO
    GR DT
    IR NO
    IR DATE
    VENDOR
    GR VALUE
    IR VALUE
    GR VALUE IN FC
    IR VALUE IN FC
    STATUS
    Rgds
    Naresh

    After joining the tables, or in the info set screen where you join the tables, you will see 'Extras'.  Click on it and use the create tab to set up ZGR_MENGE (Goods Receipt) and ZIR_MENGE (Invoice Receipt). 
    Define long texts and header descriptions.  Under format, enter 'P' for type, '007' for output length and '03' for decimals.  Enter EKBE-MENGE for LIKE Reference.  Sequence for code '1'.
    For Invoice Receipt, enter 017 for output length.
    Just to add, your ABAPer would have to put in simple code based on the specifications you give him/her.
    Best of luck.
    Elias

  • Is there an issue with Safari on iPhones and iPads where it does not recognize the option disabled="disabled" tag properly?

    Greetings everybody,
    I have run into a peculiar problem with the Safari browser on mobile platforms (iPad, iPod, iPhone) which I hope I can find a solution for in this community.
    I just launched a new website where I sell products with certain variations, for example please view this link: http://www.finerribbon.com/aegean-single-face-satin-ribbon.html
    On this particular product, the product variants are supposed to work in such a way where:
    If you select the value "1/8" from the "Choose Ribbon Width" field
    THEN
    The only options active and available to choose from in the "Choose Roll Size" field should be: 500 Yds, 20 Yds & Sample Swatch
    Now, if we browse to the above link in Safari using a desktop or a laptop, there are no problems at all; but if we browse to the above link using an iPhone or an iPad, then we have a problem where all the options are available regardless of the values chosen. Basically, the above functionality does not work.
    I was told that there is an issue with Safari on iPhones and iPads where it does not recognize the option disabled="disabled" tag properly. Is this true? Can anyone advise me if there is a solution to this problem? I would sincerely appreciate it.
    Thank you very much for your time and help!

    Hi...
    I have run into a peculiar problem with the Safari browser on mobile platforms (iPad, iPod, iPhone) which I hope I can find a solution for in this community.
    Now, if we browse to the above link in Safari using a desktop or a laptop, there are no problems at all, but if we browse to the above link using an iPhone or an iPad, then we have a problem
    At the top of this window you'll see the following:
    Apple Support Communities > Mac OS & System Software > Safari > Discussions
    This is the Safari forum for Mac OS X.
    It would be better to post your topic here > Developer Forums: Apple Support Communities
    I do see on my iPad what you are referring to. But you should get the feedback you need in the developer forum.

  • How to populate SNDPRN with different values in Development and Production?

    Hello experts,
    I have to fill the field SNDPRN in the message mapping with a different value in Development and in Production. As I am new to PI, I used a simple solution - but it is rather ugly: I set a constant value in the mapping in development and a different one in Production. However, I would like to know if there is any solution to have a condition in the mapping like:
    IF system is Development, set SNDPRN as Constant1.
    IF system is Production, set SNDPRN as Constant2.
    (And possibly, IF system is Test, set SNDPRN as Constant3.)
    Thanks in advance for your help,
    Luis

    Hi Luis,
          You can go with parameterized mapping, where you can provide different values for SNDPRN in the interface determination for development and production.
          If at any point you want to change the values, it requires only a change in the configuration.
          Please refer to the links below for reference:
    Parameterized Mapping Programs - Enterprise Services Repository - SAP Library
    http://scn.sap.com/people/jin.shin/blog/2008/02/14/sap-pi-71-mapping-enhancements-series-parameterized-message-mappings
         We are following the same approach for some of our interfaces.
    - Muru

  • Difference in value Between BW and R/3

    Hi,
    We are in the process of validating reports. We have observed a difference in value for the credit and debit amount fields in BW. The data comes from the standard data source 0EN_JVA_2 (Joint Venture: Actual line items using delta extraction), and we have done a full data load. In the data source the data comes from a single field, which is split based on the debit/credit indicator in R/3.
    I need your valuable input on where the problem is for the difference in value between BW and R/3.
    Regards
    Deva

    Hi,
    Is there any filtering in the IP?
    Any data packet deletion activity in the start routine?
    Does any update rule for the key figure have a routine that modifies the value?
    Kindly check.
    If you are only comparing the number of records, remember that aggregation always happens.
    If the difference is in the quantities of the key figure values, kindly check the above three.
    Kindly come back to us so that we can suggest further.
    Hope this helps
    Janardhan Kumar

  • Measuring the value of "session_cached_cursors"  and "open_cursor"

    Friends,
    Recently, on my production database server running Oracle 10g (version 10.2.0.1.0), I got "open_cursor"- and "session_cached_cursors"-related errors where OEM asked me to increase the values. I have increased the values, but the problem is still not solved.
    Can anybody please tell me how I can measure the standard values of "open_cursors" and "session_cached_cursors" for my database server?
    Another question:
    SQL> show parameter open_
    NAME TYPE VALUE
    open_cursors integer 500
    In the above output, what is the unit of the value 500? Is this value related to the SGA memory area?

    shipon_97 wrote:
    Thanks all for the reply.
    I have another query: how can I find the standard value of "open_cursors" as well as the "session_cached_cursors" parameter for my Oracle database server? And what are the recommended values of these parameters? I am using Oracle Database 10g (v10.2.0.1.0).

    Shipon,
    You can see the values of the parameters in your db with the simple show parameter command:
    >
    show parameter open_cursors
    show parameter session_cached_cursors
    >
    About the settings of the parameters and their optimal values, I guess there won't be any "concrete" answer to that. SESSION_CACHED_CURSORS is set to 50 by default in Oracle, which means 50 cursors can be marked as 'hot cursors' for the system and will be spared the library cache lookup. There is also a condition: a cursor is marked hot only once it has been run 3 times. So you need to check how many queries in your system actually require this optimization. Moreover, this is used (or said to be used) when you are seeing Library Cache Latch contention. I don't think that just for the sake of change you need to modify the parameter from its default.
    The same is true for OPEN_CURSORS as well. The value needs to be changed if you are seeing an error about the maximum number of open cursors exceeding the set value. Generally, a value of 2000 is enough for most systems, but that may vary from site to site, and surely enough, you need to check yours before playing around.
    HTH
    Aman....

  • How to set value for trim and bleed using jsx script?

    I have tried using bleedOffsetRect but somehow it is not working. Can anybody please help me with how to set values for trim and bleed using a jsx script? Any example will be highly appreciated.
    Following is the code I am trying with:
    var _saveName = new File(root_path + _strFileName + ".pdf");
    var _saveOpts = new PDFSaveOptions();
    _saveOpts.printerResolution = 300;
    var bleedarray = new Array();
    bleedarray[0] = 9.00;
    bleedarray[1] = 9.00;
    bleedarray[2] = 9.00;
    bleedarray[3] = 9.00;
    _saveOpts.bleedOffsetRect = bleedarray;

    I would expect, although I've not actually tried this, the bleed offset box to be larger than the artboard and the first two values to be negative or 0…
    var bleedarray = new Array(-9, -9, artboard.width + 9, artboard.height + 9);
    Where 'artboard.width' & 'artboard.height' are values you will have calculated from your file. An Array(9,9,9,9) would not constitute any box's bounds.

  • Opening values for SFG and FG....Material master or MR21?

    hi Experts,
    When we upload the opening material values, where will it pick the values for SFG and FG from: the Material Master or MR21?
    I want to run the costing run and save the prices after uploading the opening balances, not before. Will there be any financial entry if I run the costing run and save the prices without making any material movements, just running the costing run?
    thanks in advance
    regards
    suresh

    Yes, that's correct. Refer to the below documentation for more clarity.
    http://help.sap.com/saphelp_erp60_sp/helpdata/en/8f/d1de34e4cb2300e10000009b38f83b/frameset.htm
    Thanks and regards
    Kedar

  • Cumulate values in structure and formulas

    Hello,
    I created the query and used the structure in the rows. But with this structure I can't use the option to cumulate values. Do you know where the problem is?
    I also have problems with cumulated data in formulas: I compute a rate of costs across two years. In the columns I have cumulated values, but the formula that computes the rate doesn't use the cumulated values.
    Thanks for your advice.
    Regards
    Petr

    Thanks, but when I use the structure this option doesn't work. Below you can see the result I get, but I want cumulated data in the columns and a rate that operates on the cumulated data.
    2006      2007     rate
    -9 905     -15 120     1,5266
    -19 294     -18 972     0,9833
    -3 931     -6 678     1,6988
    -11 821     -13 935     1,1788
    -16 144     -26 749     1,6569
    -17 075     -10 294     0,6029
    -22 715     -13 006     0,5726
    -9 648     -10 206     1,0579
    -1 631     -11 895     7,2953
    -16 231     -20 033     1,2343
    -14 489     -20 293     1,4005
    -19 365     -23 894     1,2339
    Thanks

  • Value based hierarchies and Period dimension should only contain  2  monthl

    Hi,
    Can anyone tell me what value-based hierarchies are? And can anyone also let me know how to filter the records in AWM, because the Period dimension should only contain 2 monthly values (Oct. 2008 (200810) and Sep. 2008 (200809))? It is urgent.
    Thanks
    vel

    The following definition of a value hierarchy comes from the OLAP Developers guide
    "A hierarchy defined only by the parent-child relationships among dimension members. The dimension members at a particular distance from the base level do not form a meaningful group for analysis, so the levels are not named. For example, an employee dimension might have a parent-child relation that identifies each employee's supervisor. However, levels that group together first-, second-, and third-level supervisors and so forth may not be meaningful for analysis."
    Typically this means that the source table has two columns, VALUE and PARENT, where PARENT has a (logical) foreign key relationship to VALUE. An example may be EMPLOYEE, MANAGER, where MANAGERs are also EMPLOYEEs.
    The simplest way to restrict the values loaded into the AW is to define a SQL view over the source table with an appropriate WHERE condition. You then use AWM to map the dimension to your new view. E.g.
    CREATE OR REPLACE VIEW RESTRICTED_TIME AS
    SELECT *
    FROM TIME
    WHERE MONTH IN (200810, 200809)

  • Select just the values between min and max of an accumulated value over day

    Hello Forum,
    a value is accumulated over a day and over a period of time; the next day the value is reset and starts to be accumulated again:
    with sampledata as (select to_date('09.09.2012 00:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
                       select to_date('09.09.2012 01:03:08', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
                       select to_date('09.09.2012 02:54:11', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
                       select to_date('09.09.2012 03:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
                       select to_date('09.09.2012 04:04:19', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
                       select to_date('09.09.2012 05:04:20', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
                       select to_date('09.09.2012 06:12:02', 'dd.mm.yyyy hh24:mi:ss') ts, 23 val from dual union all
                       select to_date('09.09.2012 07:12:03', 'dd.mm.yyyy hh24:mi:ss') ts, 29 val from dual union all
                       select to_date('09.09.2012 08:12:04', 'dd.mm.yyyy hh24:mi:ss') ts, 30 val from dual union all
                       select to_date('09.09.2012 09:12:11', 'dd.mm.yyyy hh24:mi:ss') ts, 45 val from dual union all
                       select to_date('09.09.2012 10:12:12', 'dd.mm.yyyy hh24:mi:ss') ts, 60 val from dual union all
                       select to_date('09.09.2012 11:12:13', 'dd.mm.yyyy hh24:mi:ss') ts, 75 val from dual union all
                       select to_date('09.09.2012 12:21:24', 'dd.mm.yyyy hh24:mi:ss') ts, 95 val from dual union all
                       select to_date('09.09.2012 13:21:26', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
                       select to_date('09.09.2012 14:21:27', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('09.09.2012 15:21:30', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('09.09.2012 16:21:32', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('09.09.2012 17:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('09.09.2012 21:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('09.09.2012 23:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('10.09.2012 00:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('10.09.2012 01:03:08', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('10.09.2012 02:54:11', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('10.09.2012 03:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('10.09.2012 04:04:19', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('10.09.2012 05:04:20', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('10.09.2012 06:12:02', 'dd.mm.yyyy hh24:mi:ss') ts, 14 val from dual union all
                       select to_date('10.09.2012 07:12:03', 'dd.mm.yyyy hh24:mi:ss') ts, 34 val from dual union all
                       select to_date('10.09.2012 08:12:04', 'dd.mm.yyyy hh24:mi:ss') ts, 58 val from dual union all
                       select to_date('10.09.2012 09:12:11', 'dd.mm.yyyy hh24:mi:ss') ts, 70 val from dual union all
                       select to_date('10.09.2012 10:12:12', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
                       select to_date('10.09.2012 11:12:13', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('10.09.2012 12:21:24', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
                       select to_date('10.09.2012 13:21:26', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
                       select to_date('10.09.2012 14:21:27', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
                       select to_date('10.09.2012 15:21:30', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
                       select to_date('10.09.2012 16:21:32', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
                       select to_date('10.09.2012 21:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual)
    select   ts, val
    from     sampledata
    order by ts asc;

    How should I change the select statement to skip all data sets before the first minimum and the duplicates after the maximum of a day, in order to get such a result:
    TS     VAL
    09.09.12 06:12     23
    09.09.12 07:12     29
    09.09.12 08:12     30
    09.09.12 09:12     45
    09.09.12 10:12     60
    09.09.12 11:12     75
    09.09.12 12:21     95
    09.09.12 13:21     120
    09.09.12 14:21     142
    10.09.12 06:12     14
    10.09.12 07:12     34
    10.09.12 08:12     58
    10.09.12 09:12     70
    10.09.12 10:12     120
    10.09.12 11:12     142
    10.09.12 12:21     153

    Thank you

    This solution works perfectly when the accumulated value has its low and its high on the same day. But I found out :( that there is also data which has its low yesterday and its high today. For a better understanding of the case: there is a machine which works over 3 shifts with irregular start and end times. For example, shift 1 can start at 5:50 or at 7:15. The accumulated value of the worked time is accumulated for each shift separately. This solution works for shift 1 (approximately 06:00-14:00) and for shift 2 (approximately 14:00-22:00), because there the low and the high of the accumulated value are on the same day. It does not work for shift 3 (approximately 22:00-06:00), because the high of the accumulated value is, or can be, on the next day.
    So the thread title should be: "Select just the values between min and max of an accumulated value over the same day (today) or over two successive days (yesterday and today)".
    Sampledata for shift 1 or shift 2:
    {code}
    with sampledata as (select to_date('09.09.2012 00:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
    select to_date('09.09.2012 01:03:08', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
    select to_date('09.09.2012 02:54:11', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
    select to_date('09.09.2012 03:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
    select to_date('09.09.2012 04:04:19', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
    select to_date('09.09.2012 05:04:20', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
    select to_date('09.09.2012 06:12:02', 'dd.mm.yyyy hh24:mi:ss') ts, 23 val from dual union all
    select to_date('09.09.2012 07:12:03', 'dd.mm.yyyy hh24:mi:ss') ts, 29 val from dual union all
    select to_date('09.09.2012 08:12:04', 'dd.mm.yyyy hh24:mi:ss') ts, 30 val from dual union all
    select to_date('09.09.2012 09:12:11', 'dd.mm.yyyy hh24:mi:ss') ts, 45 val from dual union all
    select to_date('09.09.2012 10:12:12', 'dd.mm.yyyy hh24:mi:ss') ts, 60 val from dual union all
    select to_date('09.09.2012 11:12:13', 'dd.mm.yyyy hh24:mi:ss') ts, 75 val from dual union all
    select to_date('09.09.2012 12:21:24', 'dd.mm.yyyy hh24:mi:ss') ts, 95 val from dual union all
    select to_date('09.09.2012 13:21:26', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
    select to_date('09.09.2012 14:21:27', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('09.09.2012 15:21:30', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('09.09.2012 16:21:32', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('09.09.2012 17:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('09.09.2012 21:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('09.09.2012 23:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('10.09.2012 00:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('10.09.2012 01:03:08', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('10.09.2012 02:54:11', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('10.09.2012 03:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('10.09.2012 04:04:19', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('10.09.2012 05:04:20', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('10.09.2012 06:12:02', 'dd.mm.yyyy hh24:mi:ss') ts, 143 val from dual union all
    select to_date('10.09.2012 07:12:03', 'dd.mm.yyyy hh24:mi:ss') ts, 144 val from dual union all
    select to_date('10.09.2012 08:12:04', 'dd.mm.yyyy hh24:mi:ss') ts, 145 val from dual union all
    select to_date('10.09.2012 09:12:11', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 10:12:12', 'dd.mm.yyyy hh24:mi:ss') ts, 147 val from dual union all
    select to_date('10.09.2012 11:12:13', 'dd.mm.yyyy hh24:mi:ss') ts, 148 val from dual union all
    select to_date('10.09.2012 12:21:24', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
    select to_date('10.09.2012 13:21:26', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
    select to_date('10.09.2012 14:21:27', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
    select to_date('10.09.2012 15:21:30', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
    select to_date('10.09.2012 16:21:32', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
    select to_date('10.09.2012 21:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual)
    , got_analytics AS
    (
         SELECT ts, val
         ,      MIN (val) OVER ( PARTITION BY TRUNC (ts)
                                 ORDER BY ts DESC
                               )                       AS min_val_after
         ,      CASE
                    WHEN ROW_NUMBER () OVER ( PARTITION BY TRUNC (ts)
                                              ORDER BY val
                                              ,        ts
                                            ) = 1
                    THEN -1 -- Impossibly low val
                    ELSE LAG (val) OVER ( PARTITION BY TRUNC (ts)
                                          ORDER BY ts
                                        )
                END                                    AS prev_val
         ,      MIN (val) OVER ( PARTITION BY TRUNC (ts) )
                                                       AS low_val_today
         ,      NVL ( LAST_VALUE (val) OVER ( ORDER BY ts
                                              RANGE BETWEEN UNBOUNDED PRECEDING
                                                    AND     ts - TRUNC (ts) PRECEDING
                                            )
                    , -1
                    )                                  AS last_val_yesterday
         FROM sampledata
    )
    SELECT ts
    ,      val
    FROM   got_analytics
    WHERE  val <= min_val_after
    AND    val >  prev_val
    AND    (   val >  low_val_today
            OR val != last_val_yesterday
           )
    ORDER BY ts
    {code}
    with the expected results:
    {code}
    1     09.09.2012 06:12:02     23
    2     09.09.2012 07:12:03     29
    3     09.09.2012 08:12:04     30
    4     09.09.2012 09:12:11     45
    5     09.09.2012 10:12:12     60
    6     09.09.2012 11:12:13     75
    7     09.09.2012 12:21:24     95
    8     09.09.2012 13:21:26     120
    9     09.09.2012 14:21:27     142
    10     10.09.2012 06:12:02     143
    11     10.09.2012 07:12:03     144
    12     10.09.2012 08:12:04     145
    13     10.09.2012 09:12:11     146
    14     10.09.2012 10:12:12     147
    15     10.09.2012 11:12:13     148
    16     10.09.2012 12:21:24     153
    {code}
    And the sampledata for shift 3 is:
    {code}
    with sampledata as (select to_date('08.09.2012 00:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 23 val from dual union ALL
    select to_date('08.09.2012 02:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 45 val from dual union all
    select to_date('08.09.2012 05:03:08', 'dd.mm.yyyy hh24:mi:ss') ts, 78 val from dual union all
    select to_date('08.09.2012 06:54:11', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
    select to_date('08.09.2012 08:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
    select to_date('08.09.2012 10:04:19', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
    select to_date('08.09.2012 12:04:20', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
    select to_date('08.09.2012 16:12:02', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
    select to_date('08.09.2012 17:12:03', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
    select to_date('08.09.2012 19:12:04', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
    select to_date('08.09.2012 21:12:11', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
    select to_date('08.09.2012 22:00:12', 'dd.mm.yyyy hh24:mi:ss') ts, 24 val from dual union all
    select to_date('08.09.2012 22:12:13', 'dd.mm.yyyy hh24:mi:ss') ts, 40 val from dual union all
    select to_date('08.09.2012 23:21:24', 'dd.mm.yyyy hh24:mi:ss') ts, 68 val from dual union all
    select to_date('09.09.2012 01:03:08', 'dd.mm.yyyy hh24:mi:ss') ts, 79 val from dual union all
    select to_date('09.09.2012 02:54:11', 'dd.mm.yyyy hh24:mi:ss') ts, 124 val from dual union all
    select to_date('09.09.2012 03:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 125 val from dual union all
    select to_date('09.09.2012 04:04:19', 'dd.mm.yyyy hh24:mi:ss') ts, 126 val from dual union all
    select to_date('09.09.2012 05:04:20', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 06:12:02', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 07:12:03', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 08:12:04', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 09:12:11', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 10:12:12', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 11:12:13', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 12:21:24', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 13:21:26', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 14:21:27', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 15:21:30', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 16:21:32', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 17:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 21:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union ALL
    select to_date('09.09.2012 22:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 5 val from dual union ALL
    select to_date('09.09.2012 22:51:33', 'dd.mm.yyyy hh24:mi:ss') ts, 23 val from dual union all
    select to_date('09.09.2012 23:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 40 val from dual union all
    select to_date('10.09.2012 00:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 50 val from dual union all
    select to_date('10.09.2012 01:03:08', 'dd.mm.yyyy hh24:mi:ss') ts, 60 val from dual union all
    select to_date('10.09.2012 02:54:11', 'dd.mm.yyyy hh24:mi:ss') ts, 78 val from dual union all
    select to_date('10.09.2012 03:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('10.09.2012 04:04:19', 'dd.mm.yyyy hh24:mi:ss') ts, 145 val from dual union all
    select to_date('10.09.2012 05:04:20', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 06:12:02', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 07:12:03', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 08:12:04', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 09:12:11', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 10:12:12', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 11:12:13', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 12:21:24', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 13:21:26', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 14:21:27', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 15:21:30', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 16:21:32', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 21:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual)
    , got_analytics AS
    (
         SELECT ts, val
         ,      MIN (val) OVER ( PARTITION BY TRUNC (ts)
                                 ORDER BY ts DESC
                               )                       AS min_val_after
         ,      CASE
                    WHEN ROW_NUMBER () OVER ( PARTITION BY TRUNC (ts)
                                              ORDER BY val
                                              ,        ts
                                            ) = 1
                    THEN -1 -- Impossibly low val
                    ELSE LAG (val) OVER ( PARTITION BY TRUNC (ts)
                                          ORDER BY ts
                                        )
                END                                    AS prev_val
         ,      MIN (val) OVER ( PARTITION BY TRUNC (ts) )
                                                       AS low_val_today
         ,      NVL ( LAST_VALUE (val) OVER ( ORDER BY ts
                                              RANGE BETWEEN UNBOUNDED PRECEDING
                                                    AND     ts - TRUNC (ts) PRECEDING
                                            )
                    , -1
                    )                                  AS last_val_yesterday
         FROM sampledata
    )
    SELECT ts
    ,      val
    FROM   got_analytics
    WHERE  val <= min_val_after
    AND    val >  prev_val
    AND    (   val >  low_val_today
            OR val != last_val_yesterday
           )
    ORDER BY ts
    {code}
    with the unexpected results:
    {code}
    - ts val
    1     08.09.2012 00:04:08     23
    2     08.09.2012 22:12:13     40
    3     08.09.2012 23:21:24     68
    4     09.09.2012 22:21:33     5
    5     09.09.2012 22:51:33     23
    6     09.09.2012 23:21:33     40
    7     10.09.2012 00:04:08     50
    8     10.09.2012 01:03:08     60
    9     10.09.2012 02:54:11     78
    10     10.09.2012 03:04:08     142
    11     10.09.2012 04:04:19     145
    12     10.09.2012 05:04:20     146
    {code}
    The result should be:
    {code}
    - ts val
    1     08.09.2012 00:04:08     23
    2     08.09.2012 02:04:08     45
    3     08.09.2012 05:03:08     78
    4     08.09.2012 06:54:11     90
    5     08.09.2012 22:00:12     24
    6     08.09.2012 22:12:13     40
    7     08.09.2012 23:21:24     68
    8     09.09.2012 01:03:08     79
    9     09.09.2012 02:54:11     124
    10     09.09.2012 03:04:08     125
    11     09.09.2012 04:04:19     126
    12     09.09.2012 05:04:20     127
    13     09.09.2012 22:21:33     5
    14     09.09.2012 22:51:33     23
    15     09.09.2012 23:21:33     40
    16     10.09.2012 00:04:08     50
    17     10.09.2012 01:03:08     60
    18     10.09.2012 02:54:11     78
    19     10.09.2012 03:04:08     142
    20     10.09.2012 04:04:19     145
    21     10.09.2012 05:04:20     146
    {code}
    Thank you for your help!
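
    As an aside, the same day-partitioned filtering can be prototyped outside the database. Below is a rough Python/pandas sketch of the simple same-day case only (it does not handle the cross-midnight shift 3); pandas is my substitution, not something used in this thread, and the toy data and column names are made up for illustration:

    import pandas as pd

    # Toy data echoing the thread's pattern: an accumulator that resets each day
    rows = [
        ('2012-09-09 00:04:08', 120), ('2012-09-09 06:12:02', 23),
        ('2012-09-09 07:12:03', 29),  ('2012-09-09 13:21:26', 120),
        ('2012-09-09 14:21:27', 142), ('2012-09-09 15:21:30', 142),
    ]
    df = pd.DataFrame(rows, columns=['ts', 'val'])
    df['ts'] = pd.to_datetime(df['ts'])
    df = df.sort_values('ts').reset_index(drop=True)

    def rising_window(g):
        # min of val from this row to the end of the day (the SQL min_val_after)
        min_after = g['val'][::-1].cummin()[::-1]
        prev = g['val'].shift()
        # keep the day's first minimum, then every strictly increasing value
        # that is not followed by anything smaller later the same day
        keep = (g['val'] <= min_after) & ((g['val'] > prev) | (g.index == g['val'].idxmin()))
        return g[keep]

    result = df.groupby(df['ts'].dt.date, group_keys=False).apply(rising_window)
    print(result)   # expected: the 06:12 through 14:21 rows, as in the thread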
