How is graph data resampled to pixels?

As a test, I set up two identical graphs with plot areas 500 pixels wide, and a noisy sine wave of 1000 samples.  One graph displayed the raw waveform, while the other got the result of using the Decimate Array function to halve the number of samples.  They are not the same.
So what exactly is a waveform graph doing when it displays a data set larger than its pixel width?

Darin.K wrote:
As you have noticed, the graph is a bit more clever than you expected.  When the number of points exceeds the number of pixels, some form of resampling must take place.  A simple decimation like you tried would potentially drop some interesting points.  Instead, outliers are given priority when selecting which points to display.  This way, sharp peaks and dips are displayed even on a large scale.  Reproducing this behavior on your own can be tricky.  And once you reduce the data set, the points are gone and zooming is not as effective.
I usually try to let the graph do as much as possible.  One simple and effective way to reduce the size of data you have to ship over the network is to use SGL precision instead of DBL precision.
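(For readers who do have to reduce the data themselves, here is a rough sketch of the kind of outlier-preserving reduction Darin describes, written in Python rather than LabVIEW and purely illustrative; the function name and bucket count are made up. For each pixel-sized bucket it keeps the minimum and maximum sample, so narrow spikes survive the reduction in a way plain decimation cannot guarantee.)

import numpy as np

def minmax_reduce(y, n_buckets):
    # Keep the min and max of each bucket (about 2*n_buckets points total)
    # so that sharp peaks and dips are preserved, unlike plain decimation.
    y = np.asarray(y, dtype=float)
    out = []
    for bucket in np.array_split(y, n_buckets):
        lo, hi = np.argmin(bucket), np.argmax(bucket)
        for i in sorted((lo, hi)):
            out.append(bucket[i])
    return np.array(out)

# Example: 1000 noisy sine samples reduced toward a 500-pixel-wide plot area
t = np.linspace(0, 4 * np.pi, 1000)
noisy = np.sin(t) + 0.1 * np.random.randn(t.size)
reduced = minmax_reduce(noisy, 250)   # ~500 points for ~500 pixels

Darin's second suggestion is independent of this: converting the array from DBL to SGL halves the bytes shipped over the network without dropping any points.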
Mega-dittos!
And if the native graph/chart is having trouble...
Check the plot styles...
Reduce the frequency of updates...
Shut off auto-scale...
Defer front panel updates before the update and undefer afterward.
Ben
Ben Rayner
I am currently active on.. MainStream Preppers
Rayner's Ridge is under construction

Similar Messages

  • How old historical data can be available for SCOM performance views to show up in graphs?

    I have created a performance rule based on a performance counter and created a performance view on the target of the rule. It shows me a graph of how the performance counter value changed over time.
    In the SCOM console there is an option in performance views to choose a start date-time and end date-time for the performance graph. It allows me to select a range of, say, a few years.
    However, going by the Microsoft documentation, it appears that performance data is stored in the OperationsManagerDW database only for a limited period.
    I have a feature requirement where, for compliance purposes, we need performance views, reports, or some other means of showing older data as well.
    So my question is: exactly how much historical data (how far back) is actually available for performance views? Is there specific Microsoft documentation that states the retention period in days or years? I need to decide based on this whether I can use performance views or whether I need to go for some kind of report, such as SSRS reports, instead.

    Hi,
    Additionally, I would like to share the following article with you. Hope it helps.
    Understanding and modifying Data Warehouse retention and grooming
    http://blogs.technet.com/b/kevinholman/archive/2010/01/05/understanding-and-modifying-data-warehouse-retention-and-grooming.aspx
    Niki Han
    TechNet Community Support

  • How do I write a vi that will save graph data and text data (related to the graph) so the next time I want to view the graph data the text data is included in the read vi?

    I am new at writing VIs and hope you may be able to help. I would like to create a VI that will graph measurements taken from a DAQ device. I would like to include text data that a user can choose from (for example: machine number, test circuit, load cell type) that will stay with the graph, so when the graph is viewed at a later time the text data (explaining parts of the graph) will be displayed with the graph data. I have included a VI I am using to capture and display a force value. Any help would be greatly appreciated.
    Attachments:
    force.vi (500 KB)

    What you want is a DATALOG file:
    When you save a file, use the BUNDLE function to bundle your machine number, test circuit, whatever (include a few spare fields), plus your graph data. Get the graph data from the source, or use a local variable of the graph itself.
    Wire the bundle output to the DATALOG TYPE of a NEW FILE function. (I presume you'll use a FILE DIALOG set to SAVE FILE to choose a file path).
    Write the same bundle output to the DATA input of a WRITE FILE function.
    Use a CLOSE FILE function to (ahem) close the file.
    When you want to read a file, use a FILE DIALOG set to EXISTING FILE (or some other means) to specify what file to read. Wire the same cluster type to the DATALOG TYPE of the FILE DIALOG, so that it will only offer files of the correct type.
    When you have a file path chosen, wire the bundle to the DATALOG TYPE input of a FILE OPEN operation.
    Use a FILE READ to read a single cluster - the output of FILE READ will be a cluster of the right type.
    Use a CLOSE FILE function to....
    Out of the FILE READ function, you can UNBUNDLE BY NAME the data and send to the graph and the other fields, or, if you're clever, you can use a cluster on the screen, and not unbundle it. That's harder though, since you probably want the text fields to be controls (inputs) and the graph to be an indicator (output).
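    (For readers outside LabVIEW: the same bundle-and-save idea can be sketched in any language. Below is a purely illustrative Python version, not the LabVIEW datalog mechanism itself, and the field names are made up. The point is simply that the descriptive text is written and read together with the graph data, so the two can never get separated:)
    import json

    def save_record(path, machine_number, test_circuit, load_cell_type, graph_data):
        # Bundle the descriptive fields together with the measurement data
        record = {
            "machine_number": machine_number,
            "test_circuit": test_circuit,
            "load_cell_type": load_cell_type,
            "graph_data": list(graph_data),
        }
        with open(path, "w") as f:
            json.dump(record, f)

    def load_record(path):
        # Reading the file gives the text fields and the graph data back as one unit
        with open(path) as f:
            return json.load(f)

    save_record("run42.json", "M-07", "circuit A", "S-type", [0.0, 1.2, 2.4])
    print(load_record("run42.json")["machine_number"])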
    Hope that helps.
    Steve Bird
    Culverson Software - Elegant software that is a pleasure to use.
    Culverson.com
    Blog for (mostly LabVIEW) programmers: Tips And Tricks

  • How to print date & time in photos

    Thanks for the help.
    I need to print either photos with time and dates in them... or the Bridge contact sheet with date and time info.
    The final printed image will be used in court to photo document evidence.
    Do I have to do a screen capture or is there some other way?
    My camera will not allow me to imprint time and date.
    I'm using CS4 on both Windows and Mac.
    Thanks again in advance for taking the time to help.
    Dances With Pixels
    I may have sent this twice, but I'm new to this and this is my first time trying to post.

    Thank you for taking the time to respond.  Much appreciated. 
    And will follow up on your suggestions.
    Subject: How to print date & time in photos
    I think there are (various) Scripts about for that.
    For example addExifDate1.5.jsx:
    http://www.adobe.com/cfusion/exchange/index.cfm?event=extensionDetail&extid=1346521
    One should be able to adapt it to include the time.
    You could ask in the Photoshop Scripting Forum or http://www.ps-scripts.com/

  • How to get data into the mySQL database?

    First some background.
    I have a website that has outgrown its designed dimensions and is a huge burden to maintain. See PPBM5 Benchmark
    There is a lot of maintenance work involved, so I'm investigating a PHP/MySQL approach to ease the burden and to add functionality to the site. With the current Excel-based structure and over 420 entries, it is cumbersome for me to maintain, and also for users to find what they need.
    A MySQL based dynamic structure is a lot easier and offers vastly more selection capabilities, like selecting only records that meet specific criteria.
    Data submission is done with a form that contains most of the relevant data, but the drawback is that the people submitting their data are often not technically inclined and give wrong answers due to a lack of understanding, or make typos. The test results are attached in one or two separate .txt files, but often the submitters have not read the instructions correctly or did something wrong, so these attached .txt files cannot be trusted automatically; they have to be checked before inclusion.
    These were my initial thoughts:
    1. Data collection:
    To avoid spending all our energy and time on correcting typos, chasing missing data, and fixing errors, I am investigating the use of CPU-Z in Ghost mode to create a .txt or .html file that contains all the relevant hardware info we need and even more. It gives all the info we currently have, but adds data like the number of memory sticks, DDR timings, stock clock speed and BCLK setting, video card info and VRAM size, etc.
    To see what I mean, run CPU-Z, go to the About tab and press the Save Report button and look at the results.
    This can all be done without user intervention in an automatic way, but maybe I need to add an AutoIt file to the test to make it all run as desired.
    If this works and I'm able to extract the relevant data from the created  file and can insert it into the database, we may be in business for the  next version of PPBM5.5 or PPBM6. It does require a modification to the instructions, making them a lot  easier, because there is less data to fill out.
    2. Data submission:
    The submission form can be simplified if  the CPU-Z data can be used. We have to create an automatic way to attach  the created .html file from CPU-Z to the submission form and we have to  streamline the Output.txt and Output-MPE.txt files to be more easily included in the 'form.lib.php' file. It  currently is manual labor and very time consuming.
    3. Adding to Database:
    I have to find a way to create database  records from the Gmail forms I receive. All incoming mail messages need  to be checked on relevancy and if relevant, need to be added  automatically to the database and then offered for approval before final inclusion in the database. Data included in the database  will then include submission date and time, Email address,  IP address  used, plus links to the files submitted and available on the website.
    4. Publication of the database:
    After approval of new records from step 3, all updates will be automatically applied to the database and accessible to users. I do not yet intend to introduce a user account requiring login before all functionality is accessible. Too much trouble and administration.
    Queries should be possible on things like CPU (check box), so include i7-920, i7-930, i7-950 but exclude i7-980X and i7-990X; size of memory (check box); overclocked (boolean, yes/no); SSD as OS disk; and similar options.
    The biggest problem is to keep the color grading and statistical indicators (Top, D9, Q3, Med, Q1 and D1) intact on dynamically generated queries. Say you make a query which results in 20 observations; it should show the related colors and legends. The next query results in 48 observations, and of course the color grading and legends need to reflect that. The question in my mind: does the RPI remain constant, independent of the query, or does it need to be recalculated on the basis of the query? (A small sketch of recomputing these indicators per query follows after this list.)
    Next thing is to allow a user to select a specific observation and by  simply clicking on it be shown, in a separate window (detail page) or  accordion, all the CPU-Z related information about the hardware.
    The graphs, Top-20 and MPE Gains, need to be dynamically adjusted, based on the query used.
    5. Ideally, external links:
    In an ideal situation, one could link the  CPU-Z data to external price databases, looking up current prices for  CPU, memory, video card, disks, raid controller, etc. to get instant  BFTB charts, based on the query made. But that is the next step.
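    On the indicator question raised under point 4: the deciles and quartiles describe whatever set of observations a query returns, so they have to be recomputed for each result set; only a value defined against a fixed reference (as RPI may be) could stay constant across queries. A minimal Python sketch with hypothetical scores, not the site's actual code:
    import numpy as np

    def grading_indicators(scores):
        # Recompute the statistical indicators for one query's result set.
        s = np.asarray(scores, dtype=float)
        return {
            "Top": s.max(),                 # assuming a higher score is better; use min() otherwise
            "D9":  np.percentile(s, 90),
            "Q3":  np.percentile(s, 75),
            "Med": np.percentile(s, 50),
            "Q1":  np.percentile(s, 25),
            "D1":  np.percentile(s, 10),
        }

    # A query returning 20 observations and one returning 48 each get their own indicators
    print(grading_indicators([439, 520, 610, 715, 830]))   # made-up scores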
    Situation now:
    I have a MySQL database that is easily updated with the new submissions. Simply create a .CSV file from the submitted forms and import that into the database. The bulk of the initial work is done. Lots remain to be done, as you can see above, but that is for a later time.
    Question:
    I have this table that needs to be filled with data from the submitted and attached files. Mr. X submitted his data and can be uniquely identified by his "Ref_ID". He attached one or two files in .TXT format with the relevant test data. These files are stored on the server with a concatenated name:
    "Ref_ID","-","filename"
    Say his Ref-ID is: 20110204-6cf5 and his submitted file is called: Output(99).txt then the file can be found on the server as
    20110204-6cf5-Output(99).txt
    I need to be able to open that comma delimited file, the contents may look like this: "439","1036","819","531" and insert these contents into the relevant record and fields.
    Graphically, [screenshot of the intended table layout not reproduced here] is what I want to achieve.
    This being my first exposure to PHP/MySQL, you can imagine I'm not clear on where to go from here.
    An added complication is that I actually have 5 numbers to insert per record, and two fields, Total Score and RPI, should be calculated fields. I haven't yet figured out how to handle calculated fields; maybe they belong only in the PHP/HTML code and not in the database.
    I hope someone can help me.

    You do have a very complex looking site and may need several tables in MySQL to handle all that data. If you are new to PHP/MySQL, I would suggest taking a look at this tutorial; it will help get you started in understanding how to $_GET info from a database and also how to $_POST data to a database. I am no expert, just learning myself, and I found this very helpful. This is the link: http://www.adobe.com/devnet/dreamweaver/articles/first_dynamic_site_pt1.html
    There are also many tutorials on YouTube to help build a CMS (content management system). I would suggest the following:
    http://www.youtube.com/user/phpacademy
    http://www.youtube.com/user/betterphp
    http://www.youtube.com/user/flashbuilding
    And many more on my channel here
    http://www.youtube.com/user/Whisperingonthewind
    A CMS makes it easier to maintain, add, edit and delete content.
    I have also recently bought a book by David Powers, "Training from the Source", which is very helpful.
    Anyway hope you get it sorted.
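    (Coming back to the concrete import question above, reading the attached comma-delimited file named after the Ref_ID and writing its numbers into the matching record, here is a rough sketch. It uses Python with the mysql-connector package instead of PHP purely for illustration, and the table and column names are invented; Total Score and RPI could then be computed in code or in the query rather than stored.)
    import csv
    import mysql.connector  # pip install mysql-connector-python

    def import_result_file(ref_id, filename, conn):
        # Attached files are stored as "<Ref_ID>-<filename>", e.g. "20110204-6cf5-Output(99).txt"
        path = ref_id + "-" + filename
        with open(path, newline="") as f:
            row = next(csv.reader(f))              # e.g. ["439", "1036", "819", "531"]
        values = [int(v) for v in row]
        # Column names are invented for illustration; the real submissions carry
        # five numbers, so extend the column list to match the actual schema.
        sql = ("UPDATE results SET score1=%s, score2=%s, score3=%s, score4=%s "
               "WHERE ref_id=%s")
        cur = conn.cursor()
        cur.execute(sql, (*values, ref_id))
        conn.commit()

    # Usage (connection details are placeholders):
    # conn = mysql.connector.connect(host="localhost", user="ppbm", password="...", database="ppbm5")
    # import_result_file("20110204-6cf5", "Output(99).txt", conn)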

  • Graph Data Labels in OBIEE 11g - Customizing

    Hi folks!
    Would somebody know how to select what is shown in graph data labels in OBIEE 11g?
    For example, in a scatter graph data label I'd like to show the Group only (not the combination of Series, Group, X and Y, as by default). For a pie chart one can select the data shown in data labels to some extent (name / value); however, I couldn't find this option for a scatter graph.
    After searching I could only find instructions for 10g - such as John's instructions at http://obiee101.blogspot.com.es/2008/01/obiee-xy-and-data-in-mouse-over-label.html.
    However, my understanding is that the graph engine in OBIEE 11g is now different, and the 10g instructions are not valid for 11g anymore.
    I was also having a look on the .cxml files in OBIEE 11g but those seem to relate to colors only?
    Any help appreciated, best regards,
    Ilmari

    Hello,
    I'm facing the same situation here. Did you find a solution to your problem?

  • How do I disable automatic hot pixel correction in ACR?

    OK - I'm really confused here...
    I just got a Nikon D800E and on my first day of shooting I noticed a hot pixel spot in my images in Bridge.  Then when I opened the files in ACR, the hot pixel spot was gone.  Apparently this is a "feature" of ACR: it automatically replaces hot pixels with RGB values from neighboring pixels.  (So those of you who think you have no hot pixels, think again - you might be shielded from the truth!  I find this fact very disconcerting, but that's a separate issue...)
    The problem is that I can't tell how many hot pixels there are.  Based on the image in Bridge, it would have to be a spot of 20-30 pixels (maybe even more).  That's unacceptable to me, especially on a $3300 camera.  Sure, maybe a few hot pixels spread around the image, but 20-30 bunched in that one spot - that's unacceptable.
    However, I can't figure out how to disable the automatic hot pixel correction in ACR, so I don't know if the issue really is 20 - 30 hot pixels or if Bridge is just doing some type of sub-sampling that makes the problem look a lot worse than it is.  Furthermore, this concerns me greatly because I've been using a D200 for many years and have never seen a single hot pixel issue.  So that says to me that the sensor on my D800E has a bigger hot pixel issue than it should (I realize all sensors have some hot/dead pixels).
    So, any help on how to go about figuring this out?  The simplest solution is to disable the automatic hot pixel correction in ACR, but I can't figure out how to do that.  I'm guessing it's not possible.
    Thanks,
    rgames

    MikeKPhoto wrote:
    …I was not aware and I have searched the ACR documentation and cannot find a reference, maybe you can point me to where this "well known for years" information is located…
    Sorry, I wouldn't presume to embark on a Google search for you, as I'm sure you can do that yourself, MikeKPhoto.
    What I can tell you, without question, is that it was discussed at length in these forums during the earliest versions of ACR, eight or nine years ago or so, and I remember participating in a discussion of the feature myself with other Pentax users in the Pentax SLR Talk forum on DPReview around 2003 or 2004.
    I found one such message from 2006 (see below), but I'm sure I was involved in discussions a few years earlier than that:
    http://forums.dpreview.com/forums/read.asp?forum=1036&message=19247067
    Forum: Pentax SLR Talk
    Subject: Re: As my istD gets older...
    Posted by: Zaldidun
    Date/Time: 2:09:06 PM, Tuesday, July 18, 2006 (GMT)
    Interesting. It's possible that my camera does have a few bad pixels, but I'd never see them because I shoot RAW exclusively and Adobe Camera Raw maps them out on the fly.
    One of these days, when I'm feeling masochistic, I guess, I'll try the Pentax software to convert a test image. Or maybe not. 
    (emphasis added)

  • Graph Data from Data Table

    I have a data table with 5 columns: 1 string column and 4 double columns, with header names.  I am trying to graph the data in Excel based on an example from NI. When I use the data from the NI example instead of my data table, it works.  I researched converting a 1D array to a 2D array but was not able to do it.

    LabViewRV wrote:
    I fixed the typos. How can I convert it into a 2D array? I tried Reshape Array according to some old questions, but it did not work for me.
    A Reshape Array can be used. You can also simply feed a 1D array into a Build Array function. That will give you a 2D array, but I do not believe that is what you should do here:
    By the way, is there any possible way to graph data in LabVIEW?
    Again, the Data Table contains 5 columns, 1st column data type is string (timestamp), 2nd, 3rd, 4th, and 5th are double
    There are numerous ways to graph data in LabVIEW. Have you looked at the shipping examples, the LabVIEW Help, or the controls palette? You have a waveform chart, a waveform graph, and an XY graph. You can also have picture graphs. Is the source data a text file or an Excel workbook? If it's a text file, you can use Read From Spreadsheet File to read the file into a 2D array of strings. You can then convert the 1st column into time information and the remaining columns into numerics. Please take a look at the examples. If you are still having problems, then post back what you have tried, and please provide an example of the data file you are trying to read and graph.
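    (As an aside, the same read-and-split idea looks like this outside LabVIEW; a small Python sketch, with the file name, delimiter and timestamp format all assumed for illustration:)
    import csv
    from datetime import datetime

    times, values = [], []
    with open("data.txt", newline="") as f:        # hypothetical file name
        reader = csv.reader(f, delimiter="\t")     # adjust the delimiter to the actual file
        next(reader)                               # skip the header row
        for row in reader:
            # first column is a timestamp string, the remaining four columns are numeric
            times.append(datetime.strptime(row[0], "%Y-%m-%d %H:%M:%S"))  # assumed format
            values.append([float(v) for v in row[1:]])

    # 'values' is now a row-by-column 2D list ready to be plotted against 'times'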

  • Implementing a Graph Data Structure

    First time posting here, I was introduced to LabVIEW while participating in FIRST.
    I was reading about graph data structures and wanted to see if I could use them in LabVIEW, as they looked like a good way to relate information. Unfortunately, it appears that the straightforward way I attempted will not work. I tried creating a typedef consisting of a cluster of two elements: an array of linked nodes and an array of edge weights. Alas, I found you can't have recursive data types, since the array of linked nodes in the typedef would need to be of the same type as the typedef itself. After a bit of searching I know why this is, but I was wondering if there is a way to get around it. From my research, it seems like using a root class and a child class is one possible, but advanced, way of doing it.
    I am currently thinking of just representing the linked nodes of a node as an array of index numbers which you can use to look up the referenced nodes in the graph array. This means that the graph array cannot be sorted or otherwise modified in a way that would invalidate the index numbers; you can only add objects onto the end. Is this thinking right, or is there a different way to go about this?

    Not an easy problem, as recursion is not native to LV programming (much less so than in any other language I have used, except assembler).
    But the solution to your index number problem is to use 'keys'. Each node should have a unique key (a number), so you can address it (search for the key) even if the array is sorted in some way.
    Again, it is not native to LV. This means that you need to program, on your own, the logic other languages use for this (pseudo-pointers, handles, keys, hashes).
    From a practical side: I would not implement the graph in LV itself but access a graph from some other source (a database?) via an interface (ActiveX, .NET). To learn a bit more about how to do it, try playing around with the tree control (it is a limited graph).
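    (To illustrate the key idea outside LabVIEW: give every node a unique key and let edges refer to keys rather than array positions, so sorting or growing the node collection never breaks the references. A minimal Python sketch, not LabVIEW code:)
    from dataclasses import dataclass, field

    @dataclass
    class Graph:
        nodes: dict = field(default_factory=dict)   # key -> node data
        edges: dict = field(default_factory=dict)   # key -> {neighbour key: edge weight}
        _next_key: int = 0

        def add_node(self, data):
            key = self._next_key                    # unique key, independent of any ordering
            self._next_key += 1
            self.nodes[key] = data
            self.edges[key] = {}
            return key

        def add_edge(self, a, b, weight=1.0):
            self.edges[a][b] = weight
            self.edges[b][a] = weight

    g = Graph()
    pump, valve = g.add_node("pump"), g.add_node("valve")
    g.add_edge(pump, valve, weight=2.5)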
    Felix
    www.aescusoft.de
    My latest community nugget on producer/consumer design
    My current blog: A journey through uml

  • Req. for Graph Data set

    Hi all,
    I am in need of a graph data set with the format: graph name, x coordinate, y coordinate.
    I would appreciate it if anybody who has such a set could send it to me, or could offer some help such as a link where I can download one, or an idea of how to generate one.
    Thanks in advance,
    Shaw.

    Thank you for your quick response.  I have tried this and got two tables, but only one input per given time.  Here is an example below of the data that needs to be inputted.
    Today's date
    112.5     0.004758855
    225       0.003022459
    337.5     0.005110742
    450       0.006503629
    562.5     0.008989879
    675       0.007626316
    787.5     0.003847718
    900       0.00263287
    1012.5    0.001321671
    1125      0.00139917
    1237.5    0.00314185
    1350      0.006394711
    1462.5    0.012263686
    1575      0.010973433
    1687.5    0.001998216
    1800      0.001878826
    1912.5    0.002599357
    There are 3600 lines of this...all grouped in one moment in time.
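    (If no ready-made set turns up, a data set in the requested graph name / x / y format is easy to generate synthetically; a small Python sketch with made-up values:)
    import csv
    import math
    import random

    with open("graph_dataset.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["graph_name", "x", "y"])
        for i in range(3600):
            # one noisy sine trace; add more loops (with other names) for more graphs
            x = i * 112.5
            y = math.sin(x / 1000.0) + random.gauss(0, 0.05)
            writer.writerow(["trace1", x, y])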

  • How to populate data in table control  .

    Hi all,
    I enter the matnr number on screen no. 103, and validation is done on that screen only.
    Now, when I want to modify that record and I enter the matnr number on screen 103 again, how do I get all the data for that number into the table control screen?

    Hi Darshan,
       Here is a detailed description of how to update data in a table control.
      Updating data in table control
    The ABAP language provides two mechanisms for loading the table control with data from the internal table and then storing the altered rows of the table control back to the internal table.
    Method 1: Read the internal table into the Table Control in the screen's flow logic. Used when the names of the Table Control fields are based on fields of the internal table.
    Method 2: Read the internal table into the Table Control in the module pool code. Used when the names of the Table Control fields are based on fields of the database table.
    Method 1 (table control fields = itab fields)
    In the flow logic we can read an internal table using the LOOP statement. Define the reference to the relevant table control by specifying WITH CONTROL <ctrl>.
    Determine which table entry is to be read by specifying CURSOR <ctrl>-CURRENT_LINE.
    After the read operation the field contents are placed in the header line of the internal table. If the fields in the table control have the same names as the internal table fields, they will be filled automatically. Otherwise we need to write a module to transfer the internal table fields to the screen fields.
    We must reflect any changes the user makes to the fields of the table control in the internal table, otherwise they will not appear when the screen is redisplayed after PBO processing (e.g., after the user presses Enter or scrolls). However, this processing should be performed only if changes have actually been made to the screen fields of the table control (hence the use of ON REQUEST).
    PROCESS BEFORE OUTPUT.
    LOOP AT ITAB_REG WITH CONTROL TCREG
    CURSOR TCREG-CURRENT_LINE.
    ENDLOOP.
    PROCESS AFTER INPUT.
    LOOP AT ITAB_REG.
    MODULE MODIFY_ITAB_REG.
    ENDLOOP.
    MODULE MODIFY_ITAB_REG INPUT.
    MODIFY ITAB_REG INDEX TCREG-CURRENT_LINE.
    ENDMODULE.
    Method 2 (table control fields = dict. fields)
    If using a LOOP statement without an internal table in the flow logic, we must read the data in a PBO module which is called each time the loop is processed.
    Since, in this case, the system cannot determine the number of internal table entries itself, we must use the EXIT FROM STEP-LOOP statement to ensure that no blank lines are displayed in the table control if there are no more corresponding entries in the internal table.
    PROCESS BEFORE OUTPUT.
    LOOP WITH CONTROL TCREG.
    MODULE READ_ITAB_REG.
    ENDLOOP.
    PROCESS AFTER INPUT.
    LOOP WITH CONTROL TCREG.
    CHAIN.
    FIELD: ITAB_REG-REG,
    ITAB_REG-DESC.
    MODULE MODIFY_ITAB_REG
    ON CHAIN-REQUEST.
    ENDCHAIN.
    ENDLOOP.
    MODULE READ_ITAB_REG OUTPUT.
    READ TABLE ITAB_REG INDEX TCREG-CURRENT_LINE.
    IF SY-SUBRC EQ 0.
    MOVE-CORRESPONDING ITAB_REG TO TCREG.
    ELSE.
    EXIT FROM STEP-LOOP.
    ENDIF.
    ENDMODULE.
    MODULE MODIFY_ITAB_REG INPUT.
    MOVE-CORRESPONDING TCREG TO ITAB_REG.
    MODIFY ITAB_REG INDEX
    TCREG-CURRENT_LINE.
    ENDMODULE.
    Updating the internal table
    Method 1
    PROCESS AFTER INPUT.
    LOOP AT ITAB_REG.
    CHAIN.
    FIELD: ITAB_REG-REG,
    ITAB_REG-DESC.
    MODULE MODIFY_ITAB_REG ON CHAIN-REQUEST.
    ENDCHAIN.
    ENDLOOP.
    MODULE MODIFY_ITAB_REG INPUT.
    ITAB_REG-MARK = 'X'.
    MODIFY ITAB_REG INDEX TCREG-CURRENT_LINE.
    ENDMODULE.
    Method 2
    PROCESS AFTER INPUT.
    LOOP WITH CONTROL TCREG.
    CHAIN.
    FIELD: TCREG-REG,
    TCREG-DESC.
    MODULE MODIFY_ITAB_REG ON CHAIN-REQUEST.
    ENDCHAIN.
    ENDLOOP.
    MODULE MODIFY_ITAB_REG INPUT.
    MOVE-CORRESPONDING TCREG TO ITAB_REG.
    ITAB_REG-MARK = 'X'.
    MODIFY ITAB_REG INDEX TCREG-CURRENT_LINE.
    ENDMODULE.
    Updating the database
    MODULE USER_COMMAND_100.
    CASE OK_CODE.
    WHEN 'SAVE'.
    LOOP AT ITAB_REG.
    CHECK ITAB_REG-MARK = 'X'.
    MOVE-CORRESPONDING ITAB_REG TO TCREG.
    UPDATE TCREG.
    ENDLOOP.
    WHEN ...
    ...
    ENDCASE.
    ENDMODULE.
    Hope this will solve your problem.
    Regards,
    Pavan.

  • How to populate data in PAY_PEOPLE_GROUPS table (People Group Flexfiled)

    Hello
    We are migrating data from one Oracle instance to another; both are on Oracle Applications 11.5.10.2. As part of the migration, can anybody let me know how to populate data in the People Group key flexfield (the PAY_PEOPLE_GROUPS table)? Ideally I will create or update employee records from the source instance in the destination instance. While creating or updating the employee records I can pass people_group_id when calling the assignment API, but my question is that before passing the group id to the API, the data should already be populated in the PAY_PEOPLE_GROUPS table so that I can fetch the group id for the combination and pass it to the API. Please suggest...

    Thanks for your information! By any chance, do you have any sample code that will create/update assignments with the People Group flexfield? When I check "hr_assignment_api.update_emp_asg_criteria", it only has a parameter for passing the people group id and does not have segment parameters for passing individual segments.
    Also, let me know of any links you have to an HR API guide that will help me develop the interfaces...
    My requirement is that we have two instances; one instance is treated as the source for HR and will be the master for all HR-related activities, and we are planning to develop an interface that will bring the master instance in sync with the dummy instance.

  • Open Hub: How-to doc "How to Extract data with Open Hub to a Logical File"

    Hi all,
    We are using Open Hub to download transaction files from InfoCubes to the application server, and would like to have a filename that is dynamic based on period and year, i.e. the period and year of the transaction data being downloaded.
    I understand we could use a logical file for this purpose. However, we are not sure how to have the period and year derived dynamically in the filename.
    I have read a number of messages posted in SDN on a similar topic, and many have suggested a 'How-to' paper titled "How to Extract data with Open Hub to a Logical Filename". However, I could not seem to get the document from the link given.
    Just wondering if anyone has the correct or latest link to the document; otherwise I would appreciate it if you could share the document with everyone in SDN if you have a copy.
    Many thanks and best regards,
    Victoria

    Hi,
    After creating the Open Hub destination, press F1 in the 'Application server file name' text box. From the help window, click 'Maintain client independent file names and file paths'; you will be taken to the Implementation Guide screen. Then:
    1. Click on 'Cross client maintenance of file name' and create a logical file path by clicking on 'New Entries'.
    2. After creating the logical file path, go to 'Logical file name definition'. There, give your logical file name, the physical file (your file name followed by month or year, whatever is applicable; press F1 for more info), the data format (ASC), the application area (BW) and the logical path (choose via F4 the path you created in step 1).
    3. Go to 'Assignment of physical path to logical path' and give the syntax group; the physical path is the path you gave in the logical file name definition.
    We created a logical file name that identifies the file by the system date, but your requirement seems to be the dynamic date of the transaction data; you may be able to achieve this by creating a variable. The F1 help would be of much use to you. All the steps explained above will help you create a dynamic logical file.
    Hope this helps you to some extent.
    Regards

  • How to set data in rtf document?

    Hi friends,
    I have an RTF document. Can anyone suggest how to set data in the cells of an RTF document? Is there any way?
    Thanks in advance..
    Regards ,
    Soumyanil

    Convert the ResultSet from the DB to an Object[][]; let's call it result.
    Then create a JTable (jTable1).
    On the JTable you need to define the headers, and the data itself.
    You can get the headers from ResultSetMetaData. Convert these to an array again (headers).
    Now create a table model from those two arrays and set it on the JTable (DefaultTableModel lives in javax.swing.table):
    TableModel model = new DefaultTableModel(result, headers);
    jTable1.setModel(model);
    That's about the basics.
    If you need more info, use the tutorial at Sun's homepage.
    How to use Tables:
    http://java.sun.com/docs/books/tutorial/uiswing/components/table.html

  • How to upload data from excel to form using webutil

    Hi,
    In the sample provided by Oracle
    http://www.oracle.com/technology/products/forms/htdocs/webutil/howto_ole.html
    Note 247606.1 How to Copy Records From a Form Into Excel
    It shows the method of how to copy data from a Form into Excel, but is there any sample that provides the steps for reading cells from Excel into a Form in 10g?

    declare
    args client_ole2.list_type;
    application client_ole2.obj_type;
    vworkbooks client_ole2.obj_type;
    vdoc     client_ole2.obj_type;
    vworksheet     client_ole2.obj_type;
    vrange               client_ole2.obj_type;
    begin
    -- create app object
    application := client_ole2.create_obj('Excel.Application');
    client_OLE2.SET_PROPERTY(application, 'Visible','True');
    -- get workbooks object
    vworkbooks := client_ole2.get_obj_property(application, 'Workbooks');
    -- and open a file
    args := client_ole2.create_arglist;
    client_ole2.ADD_ARG(args, 'c:\tp_ae.xls');
    vdoc :=client_ole2.INVOKE_OBJ(vworkbooks,'Open',args);
    client_ole2.destroy_arglist(args);
    -- get a worksheet object
    -- for this to work you need to know the sheet name or its index
    args := client_ole2.create_arglist;
    client_ole2.ADD_ARG(args, 1); -- sheet name or index
    vworksheet := client_ole2.get_obj_property(vdoc,'Worksheets',args);          
    client_ole2.destroy_arglist(args);
    -- get a range object which in this case is just a cell
    -- for this to work you need to know the cell coordinates
    args := client_ole2.create_arglist;
    client_ole2.ADD_ARG(args, 'B6');          
    vrange := client_ole2.get_obj_property(vworksheet,'Range',args);
    client_ole2.destroy_arglist(args);
    -- and here you get the value
    message(client_ole2.get_char_property(vrange,'Value'));
    -- release objects          
    client_ole2.release_obj(vrange);
    client_ole2.release_obj(vworksheet);
    client_ole2.release_obj(vdoc);
    client_ole2.release_obj(vworkbooks);
    client_ole2.release_obj(application);
    end;
