Writing output data via CN2FF to Smar FB/I device

I am having a problem writing data from a ControlLogix 5000 via a 1788-CN2FF linking device to a Smar FB/I FI302. I have the proper function block and can see the data being "sent" from the CN2FF output to the input of the Smar function block (good data), with all blocks in "Auto" and none in "OOS"; however, the Smar block is not sending any data via its output. Any ideas?

Hello,
Would you please contact Rockwell for support on this issue? Thanks!

Similar Messages

  • How to send ALV output data in Excel sheet format via mail to the user?

    Hi friends,
    I have a question:
    How do I send ALV output data in Excel sheet format via mail to the user?
    Regards,
    Moosa

    Hi,
    Provide the output internal table as contents_bin (i_objbin) in the function module below.
    Send the message:
      CALL FUNCTION 'SO_NEW_DOCUMENT_ATT_SEND_API1'
           EXPORTING
                document_data = i_docdata
                put_in_outbox = c_x
           TABLES
                packing_list  = i_objpack
                object_header = i_objhead
                contents_bin  = i_objbin
                contents_txt  = i_objtxt
                receivers     = i_reclist.
    and specify the document type in the packing list:
      i_objpack-doc_type = 'XLS'.
    Then try again.
    Regards,
    Nandha

  • Exception writing binary data to the output stream to client -Broken pipe

    Hi,
    I am trying to use the drag & drop feature in Contributor mode of WebCenter Sites. The Single Image Page Attribute is working properly, whereas the Multiple Image Page Attribute throws the following error:
    [ERROR] [.kernel.Default (self-tuning)'] [logging.cs.satellite.request] Exception writing binary data to the output stream to client 10.191.117.106
    java.net.SocketException: Broken pipe
         at java.net.SocketOutputStream.socketWrite0(Native Method)
         at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:92)
         at java.net.SocketOutputStream.write(SocketOutputStream.java:136)
         at weblogic.servlet.internal.ChunkOutput.writeChunkTransfer(ChunkOutput.java:568)
         at weblogic.servlet.internal.ChunkOutput.writeChunks(ChunkOutput.java:539)
         at weblogic.servlet.internal.ChunkOutput.flush(ChunkOutput.java:427)
         at weblogic.servlet.internal.ChunkOutput$2.checkForFlush(ChunkOutput.java:648)
         at weblogic.servlet.internal.ChunkOutput.write(ChunkOutput.java:333)
         at weblogic.servlet.internal.ChunkOutputWrapper.write(ChunkOutputWrapper.java:148)
         at weblogic.servlet.internal.ServletOutputStreamImpl.write(ServletOutputStreamImpl.java:148)
         at COM.FutureTense.Servlet.ServletRequest$OutputOutputStream.write(ServletRequest.java:80)
         at COM.FutureTense.Servlet.ServletRequest.write(ServletRequest.java:1633)
         at com.openmarket.Satellite.RequestContext.write(RequestContext.java:1123)
         at com.openmarket.Satellite.BytePiece.stream(DataPiece.java:253)
         at com.openmarket.Satellite.CacheObjectImpl.stream(CacheObjectImpl.java:651)
         at com.openmarket.Satellite.Http11Responder.respondForWrapper(Http11Responder.java:142)
         at com.openmarket.Satellite.WrapperAwareResponder.respond(WrapperAwareResponder.java:36)
         at com.openmarket.Satellite.SatelliteServer.execute(SatelliteServer.java:85)
         at com.openmarket.Satellite.servlet.BaseServlet.doGet(BaseServlet.java:118)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:707)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
         at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
         at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
         at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:301)
         at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:26)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at com.fatwire.wem.sso.cas.filter.CASFilter.doFilter(CASFilter.java:557)
         at com.fatwire.wem.sso.SSOFilter.doFilter(SSOFilter.java:51)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.wrapRun(WebAppServletContext.java:3730)
         at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3696)
         at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
         at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
         at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2273)
         at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2179)
         at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1490)
         at weblogic.work.ExecuteThread.execute(ExecuteThread.java:256)
         at weblogic.work.ExecuteThread.run(ExecuteThread.java:221)
    Thanks
    KarthiK

    Thank you very much.
         FileOutputStream opGif = new FileOutputStream(destFile, false);
    I have replaced the line above with the following line:
         PrintWriter opGif = new PrintWriter(new FileWriter(destFile, false));
    and now the code is working fine.
    Thanks once again...
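    For reference, the fix above swaps a raw byte stream for a character writer. A minimal sketch of the same pattern, with try-with-resources so the file is always closed, and checkError() since PrintWriter swallows I/O exceptions (the file name and content are placeholders, not from the original post):
        import java.io.FileWriter;
        import java.io.IOException;
        import java.io.PrintWriter;

        public class WriteTextFile {
            public static void main(String[] args) throws IOException {
                String destFile = "output.txt";      // placeholder path
                String content = "example content";  // placeholder data
                try (PrintWriter opGif = new PrintWriter(new FileWriter(destFile, false))) {
                    opGif.print(content);
                    if (opGif.checkError()) {        // PrintWriter never throws on write
                        System.err.println("Write failed for " + destFile);
                    }
                }
            }
        }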

  • Error writing output with iview for xml form builder

    Hi,
    I created an XML Forms Builder project in which I developed an "Edit" and a "ListEdit" sheet.
    I also created an iView for these in which the code link is "com.sap.km.cm.xmlform", and the fields for the Style Sheet for List and for the single item are set up correctly, but when I tried the preview the following error message appeared:
    com.inqmy.lib.xsl.xslt.XSLOutputException: Error writing output. -> org.w3c.dom.DOMException: Root Element is already present, cannot be appended as a child.
    could someone help me?
    thank's a lot!
    Nick.

    Hi,
    Now I'm confused; what do you mean by "create new data"? Are users editing existing documents
    (as if they go to a document example.xml > edit) or are they creating new documents (as if they go
    to folder > new > forms)?
    The problem is, in both cases a user would need read/write permissions.
    The normal flow by which content (data) is created in KM is as follows:
    1. user is assigned to a role
    2. role contains KM Navigation iView
    3. KM Navigation iView executes com.sap.km.cm.navigation component
    4. user chooses New > Form UI command (edit_xml_forms)
    5. edit_xml_forms UI command executes its code (com.sapportals.wcm.rendering.uicommand.cm.UIXMLFormsCreateCommand) and open xml edit for the user
    6. user fill the form and click Save, form is created into folder
    From what I understand so far, your requirement basically asks you to go directly to step 5. It is
    possible to pass a URL that goes directly to step 5, the UI command button, but if you do that
    you won't have a context, so the chances that it will work are slim, since a context is required to
    fill the parameters the app asks for (like folder, user, permissions, etc.). Even so, in some cases you can
    still pass the parameters via POST in the URL, but you must know which service/parameters the
    app asks for, and a URL is static...
    That was the create scenario; I think it has more cons than pros, since users would still be
    able to bypass the URL iView created for that. I'd suggest evaluating again whether it's really
    a problem having users access CM to manage data.
    kind regards,
    Rafael

  • Writing HTML data in TEXT File

    I am writing HTML content to a text file... I read it into a string and then write it out using PrintStream...
    But it writes all the content on one line... I want to write it as it was in the source HTML file...

    Perchance, the OP is referring to the lack of carriage returns in his outputted data, and querying where they have absconded to? An analysis of his posts to date reveals that he has read the contents of a file into a String variable, and wants to write the (presumably modified) String back to a file (uncertain as to whether this is the same file or a different one). In the process of outputting to said file, carriage returns previously contained in the original input file are not present in the output. I would hazard a guess that the OP does not want said carriage returns in the String to disappear.
    Is my analysis of the situation correct?
    An example of the code that you have written to date would be of extreme usefulness to us in aiding you in your noble quest.
    Without such a resource, I must rely on my intuition instead.
    Possible places that the carriage returns are being "dropped":
    1 - On reading in. Are you discarding carriage returns while building up your variable?
    2 - On manipulating. Are you processing this line by line, or reading it all into one large variable?
    3 - On output. Are you using "print" or "println" when writing? (See the sketch below.)
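    To illustrate point 3, here is a minimal sketch (file names are placeholders) that reads a file line by line and writes it back with println, which re-adds the line separator that readLine() strips:
        import java.io.*;

        public class CopyPreservingLineBreaks {
            public static void main(String[] args) throws IOException {
                try (BufferedReader in = new BufferedReader(new FileReader("source.html"));
                     PrintStream out = new PrintStream(new FileOutputStream("output.txt"))) {
                    String line;
                    while ((line = in.readLine()) != null) {
                        out.println(line); // println restores the separator readLine() dropped
                    }
                }
            }
        }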
    I look forward to further correspondence with you sir
    kind regards,
    evnafets

  • How can I convert output data (string?) from GPIB-read to an 1D array?

    Hello all,
    I am reading a displayed waveform from my Tektronix Oscilloscope (TDS3032) via the GPIB Read VI. The format of the waveform data is: positive integer data-point representation with the most significant byte transferred first (2 bytes per data point).
    The output data of GPIB Read looks like a string(?) where the integer numbers and a sign like the euro currency sign are separated by spaces, e.g. #5200004C3 4 4 4 4 3C3C3........ (C represents the euro currency sign).
    How can I convert this waveform data into a 1D/2D array of real double floating-point numbers (DBL) so I can handle the waveform data for data analysis?
    It would be very nice if someone knows the solution for this.
    Thanks

    Hi,
    First of all, I'm assuming you are using LabVIEW.
    The first thing you need to do is parse the string returned by the instrument. In this case you need to search for the known symbols in the string (like the euro sign) and chop the string to get the numeric strings. There are some examples on parsing at www.ni.com (keyword search: "parsing").
    Once you parse the numeric strings, you can use the "String/Number Conversion" VIs in the String palette.
    Hope this helps.
    DiegoF.
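    For what it's worth, the leading "#5" followed by "20000" looks like an IEEE 488.2 definite-length binary block: '#', one digit giving the width of the length field, the byte count, then raw big-endian 16-bit samples. So the "euro signs" are most likely binary bytes rendered as text rather than separators. Purely as an illustration of that decoding logic (a sketch in Java, since the LabVIEW diagram itself can't be shown in text):
        import java.nio.ByteBuffer;
        import java.nio.ByteOrder;

        public class DecodeGpibBlock {
            // Decodes an IEEE 488.2 block: '#' + <n> + <n-digit byte count> + data.
            static double[] decode(byte[] raw) {
                if (raw[0] != '#') throw new IllegalArgumentException("not a block");
                int n = raw[1] - '0';                                 // width of the length field
                int count = Integer.parseInt(new String(raw, 2, n));  // payload size in bytes
                ByteBuffer buf = ByteBuffer.wrap(raw, 2 + n, count).order(ByteOrder.BIG_ENDIAN);
                double[] samples = new double[count / 2];             // 2 bytes per data point
                for (int i = 0; i < samples.length; i++) {
                    samples[i] = buf.getShort() & 0xFFFF;             // unsigned 16-bit value
                }
                return samples;
            }
        }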

  • Error while writing the data into the file . can u please help in this.

    I am getting the following error while writing data to the file:
    <bindingFault xmlns="http://schemas.oracle.com/bpel/extension">
      <part name="code">
        <code>null</code>
      </part>
      <part name="summary">
        <summary>file:/C:/oracle/OraBPELPM_1/integration/orabpel/domains/default/tmp/.bpel_MainDispatchProcess_1.0.jar/IntermediateOutputFile.wsdl [ Write_ptt::Write(Root-Element) ] - WSIF JCA Execute of operation 'Write' failed due to: Error in opening file for writing. Cannot open file: C:\oracle\OraBPELPM_1\integration\jdev\jdev\mywork\BPEL_Import_with_Dynamic_Transformation\WORKDIRS\SampleImportProcess1\input for writing. ; nested exception is: ORABPEL-11058 Error in opening file for writing. Cannot open file: C:\oracle\OraBPELPM_1\integration\jdev\jdev\mywork\BPEL_Import_with_Dynamic_Transformation\WORKDIRS\SampleImportProcess1\input for writing. Please ensure 1. Specified output Dir has write permission 2. Output filename has not exceeded the max characters allowed by the OS and 3. Local File System has enough space.</summary>
      </part>
      <part name="detail">
        <detail>null</detail>
      </part>
    </bindingFault>

    Hi there,
    Have you verified the suggestions in the error message?
    Cannot open file: C:\oracle\OraBPELPM_1\integration\jdev\jdev\mywork\BPEL_Import_with_Dynamic_Transformation\WORKDIRS\SampleImportProcess1\input for writing.
    Please ensure
    1. The specified output directory has write permission,
    2. The output filename has not exceeded the max characters allowed by the OS, and
    3. The local file system has enough space.
    I am also curious why you are writing to a directory with the name "..\SampleImportProcess1\input"?
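    If it helps, the first and third suggestions can be checked from plain Java before re-running the process. A diagnostic sketch (the path is the one from the error message; getFreeSpace() requires Java 6 or later):
        import java.io.File;

        public class CheckOutputDir {
            public static void main(String[] args) {
                File dir = new File("C:\\oracle\\OraBPELPM_1\\integration\\jdev\\jdev\\mywork\\"
                        + "BPEL_Import_with_Dynamic_Transformation\\WORKDIRS\\SampleImportProcess1");
                System.out.println("exists:      " + dir.exists());
                System.out.println("isDirectory: " + dir.isDirectory());
                System.out.println("canWrite:    " + dir.canWrite());                 // suggestion 1
                System.out.println("freeSpace:   " + dir.getFreeSpace() + " bytes");  // suggestion 3
            }
        }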

  • Problem writing meta data changes in xmp in spite of enabled settings

    Dear Adobe Community
    After struggling with this for two full days and one night, you are my last hope before I give up and migrate to Aperture instead.
    I am having problems with Lightroom 5 writing metadata changes into XMP and including development settings in JPEGs, in spite of having ticked all three boxes in Catalog Settings.
    In spite of having checked all the boxes, Lightroom refused to actually perform the actions. I allowed the save action to take a lot longer than the saving indicator showed was needed, but regardless of this no edits made in the photo would be visible outside Lightroom. I also tried unticking and re-ticking the boxes and restarting my computer.
    Therefore, I uninstalled the program and then reinstalled it (the trial version both times). I added about 5000 images to Lightroom (i.e. referenced). After having made a couple of changes to one photo in development settings, I tried closing the program. However, this message was then displayed:
    I left the program open and running for about 5-5 hours, then tried closing the program, but the message still came up, so I closed the program and restarted the computer. I tried making changes to another photo, saving and then closing, and the same message comes up. The program also becomes unresponsive, and of course still no metadata has been saved to the photo, i.e. when opening it outside Lightroom, the edits to the photo are not shown.
    What to do? I would greatly appreciate any insights, since I have now completely hit the wall.
    Oh yes, that's right:
    What version of Lightroom? Include the minor version number (e.g., Lightroom 4 with the 4.1 update).
    Lightroom 5.3
    Have you installed the recent updates? (If not, you should. They fix a lot of problems.)
    I installed the program two days ago and then for the second time today.
    What operating system? This should include specific minor version numbers, like "Mac OSX v10.6.8"---not just "Mac".
    Mozilla/5.0 (Macintosh; Intel Mac OS X 10_8_5) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/31.0.1650.63 Safari/537.36
    What kind(s) of image file(s)? When talking about camera raw files, include the model of camera.
    JPEG
    If you are getting error message(s), what is the full text of the error message(s)?
    Please see screen dumps above
    What were you doing when the problem occurred?
    Trying to save metadata + trying to open images that it seemed I had saved meta data to
    Has this ever worked before?
    No
    What other software are you running?
    For some time Chrome, Firefox, Itunes. Then I closed all other software.
    Tell us about your computer hardware. How much RAM is installed?  How much free space is on your system (C:) drive?
    4 GB 1333 MHz DDR3
    Has this ever worked before?  If so, do you recall any changes you made to Lightroom, such as adding Plug-ins, presets, etc.?  Did you make any changes to your system, such as updating hardware, printers or drivers, or installing/uninstalling any programs?
    No, the problems have been there all the time.

    AnnaK wrote:
    Hi Rob
    I think you succeeded in partly convincing me. : ) I think I will go for a non-destructive program like LR when I am back in Sweden, but will opt for a destructive one for now. Unfortunately, I have an Olympus, so judging from your comment NX2 might not be for me.
    Hi AnnaK (see below).
    AnnaK wrote:
    My old snaps are JPEG, but I recently upgraded to an Olympus e-pl5 and will notw (edited by RC) start shooting RAW.
    Note: I edited your statement: I assume you meant now instead of not.
    If you start shooting raw, then you're gonna need a raw processor, regardless of what the next step in the process will be. And there are none better for this purpose than Lightroom, in my opinion. As has been said, you can export those back to Lightroom as JPEG then delete the raws, if storage is a major issue, or convert to lossy DNG. Both of those options assume you're willing to adopt a non-destructive workflow from there on out anyway (not an absolute requirement, but probably makes the most sense). It's generally a bad idea to edit a JPEG then resave it as a JPEG, because quality gets progressively worse every time you do that. Still, it's what I (and everybody else) did for years before Lightroom, and if you want to adopt such a workflow then yeah: you'll need a destructive editor that you like (or, as I said, you can continue to use Lightroom in that fashion, by exporting new JPEGs and deleting originals - really? that's how you want to go???). Reminder: NX2 works great on JPEGs, and so is still very much a candidate in my mind; my biggest reservation in recommending it is uncertainty about its future (it's kinda in limbo right now).
    AnnaK wrote:
    Rob Cole wrote:
    There is a plugin which will automatically delete originals upon export, but relying on plugins makes for additional complication too.
    Which plugin is this?
    Exportant (the option is invisible by default, but can be made visible by editing a text config file). To be clear: I do not recommend using Exportant for this purpose until after you've got everything else set up and functioning, and even then it would be worth reconsidering.
    AnnaK wrote:
    Rob Cole wrote:
    What I do is auto-publish to all consumption destinations after each round of edits, but that takes more space.
    How do you do this?
    Via Publish Services.
    PS - I also use features in 'Publish Service Assistant' and 'Change Manager' plugins (for complete automation), but most people just select publish collections and/or sets and click 'Publish' - if you only have a few collections/services it's convenient enough.
    AnnaK wrote:
    Would you happen to have any tips on which plugins I may want to use together with Photoshop Elements?
    No - sorry, maybe somebody else does.
    Did I get 'em all?
    Rob

  • Writing binary data to a file without carriage returns every 512 bytes

    Is there a VI for writing binary data to a file without carriage returns being inserted every 512 bytes?
    Thanks

    Hi Momolxg,
    I could be way off on this. I tried to simulate what you've done by
    making a for loop that would run a set number of times. For my example I
    used 1025. I wired the iteration terminal to a 'Write to SGL File.vi'
    outside the loop with indexing enabled. It wrote the SGL data from 0 to
    1024 to the file. I then read the file with a 'Read Characters from
    File.vi' and searched the output for a carriage return (0D hex). It was
    found five times. The reason why was that the SGL number it was reading had a
    13 (0D hex) in it. Perhaps you're running into a similar problem?
    I tried it again, this time using the 'Write to I16 File.vi'. The
    carriage return was found five times: on the 28th character the first time,
    then on the 512th character four consecutive times after that. I suppose
    that makes sense, that you'd find a 0D in the numbers at equal spacings if
    they're incrementing this way... In this case the carriage returns you're
    seeing are actually numbers from your data.
    One big difference is that I'm using a set pattern of numbers. This
    doesn't appear to be your case. Is there a better way we can duplicate
    your problem? It sounds interesting. Again my simulation could be way
    off. (I'm also running this on LV60 for Linux so my results could be
    different)
    - Kevin
    In article <[email protected]>,
    "momolxg" wrote:
    > Is there a VI for writing binary data to a file without carriage returns
    > being inserted every 512 bytes? Thanks
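    Kevin's conclusion generalizes: a plain binary write inserts nothing, so any 0x0D bytes found afterwards came from the data itself. A small sketch (in Java, purely as an illustration outside LabVIEW) that writes 0..1024 as big-endian 16-bit values and then counts the 0x0D bytes that occur naturally (13, 269, 525, ...):
        import java.io.*;

        public class CountCarriageReturns {
            public static void main(String[] args) throws IOException {
                File f = new File("data.bin");
                // Write 0..1024 as big-endian 16-bit integers; nothing inserts CRs.
                try (DataOutputStream out = new DataOutputStream(new FileOutputStream(f))) {
                    for (int i = 0; i <= 1024; i++) out.writeShort(i);
                }
                int crCount = 0;
                try (FileInputStream in = new FileInputStream(f)) {
                    int b;
                    while ((b = in.read()) != -1) if (b == 0x0D) crCount++;
                }
                System.out.println("0x0D bytes found: " + crCount); // data, not line endings
            }
        }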

  • Reading client-posted data via Tomcat servlet

    Hi,
    I'm doing some simple client/servlet communication using Tomcat, using POST's via an HttpURLConnection.
    I'm seeing my data on the servlet side via getInputStream(), but I can't decode it to a String, even by using InputStreamReader (which is supposed to do that). If I send a 5-character String, at the servlet I see "^@^@^@^@^@". I've tried constructing a String from a byte[], but it doesn't decode either.
    I've tried writing the data as text with PrintWriter on the client side, and then reading text on the servlet side with getReader(), but getReader() returns null.
    I'm beginning to wonder if the problem is within Tomcat itself.
    Reading the docs and api's leads me to believe this should be very straightforward, but it's just not working.
    Any thoughts would be greatly appreciated!!!

    Here's the code... but there's been a strange twist:
    (client side)
    private void writeStringAsText(OutputStream outputStream, String content) throws UnsupportedEncodingException {
        PrintWriter printer = new PrintWriter(outputStream);
        String encoded = URLEncoder.encode(content, "UTF-8");
        System.err.println("asText is [" + content + "], encoded = [" + encoded + "]");
        printer.println(encoded);
        printer.flush();
        printer.close();
    }
    Servlet side:
    BufferedReader reader = request.getReader();
    StringBuffer sBuf = new StringBuffer();
    String line;
    while ((line = reader.readLine()) != null) {
        sBuf.append(line);
    }
    Yesterday the problem disappeared, but it reappeared today.
    So, yesterday, the above code worked fine. Today I learned that
    a call to reader.ready() (servlet side) returns false.
    Does anyone know why a BufferedReader would return ready() == false??
    (Other than what the Javadoc say)
    Thanks in advance!
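    One thing worth double-checking on the client side, offered as a guess since the connection setup isn't shown: getReader() has nothing to read if the body was never actually sent, and the servlet spec allows only one of getReader()/getInputStream() per request. A minimal HttpURLConnection sketch with the method, content type, and output mode set explicitly (the URL is a placeholder):
        import java.io.OutputStream;
        import java.io.PrintWriter;
        import java.net.HttpURLConnection;
        import java.net.URL;
        import java.net.URLEncoder;

        public class PostClient {
            public static void main(String[] args) throws Exception {
                URL url = new URL("http://localhost:8080/myapp/myservlet"); // placeholder
                HttpURLConnection conn = (HttpURLConnection) url.openConnection();
                conn.setRequestMethod("POST");
                conn.setDoOutput(true); // without this, no request body is sent
                conn.setRequestProperty("Content-Type",
                        "application/x-www-form-urlencoded; charset=UTF-8");
                try (OutputStream out = conn.getOutputStream();
                     PrintWriter printer = new PrintWriter(out)) {
                    printer.println(URLEncoder.encode("hello", "UTF-8"));
                }
                System.out.println("HTTP status: " + conn.getResponseCode()); // forces the send
            }
        }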

  • Update and transfer data via BADI LE_SHP_TAB_CUST_HEAD

    Greetings All,
    I've a requirement to create a custom tab in the VL01N/VL02N/VL03N header record displaying custom fields.
    I've successfully implemented BADI LE_SHP_TAB_CUST_HEAD, created a subscreen, appended my custom fields to the LIKP table via an append structure, and can now view my fields in the transactions listed above.
    I'm having trouble updating the fields in the subscreen and saving the values back to the LIKP table. First question: a) do I do this via the PBO/PAI modules in my subscreen, or should I be doing this in the BADI?
    Second question: if I should be doing this in the BADI, how do I do it? A simple example: I have created a field called ZZ_CUST_TIME in LIKP and added it to my subscreen using data dictionary linking.
    How do I pass a value entered into this field via VL02N back to the transaction for update?
    Any suggestions would be gratefully appreciated.
    Regards,
    Steve

    Dear Abhisek, can you explain the steps? On this step the screen is coming up, but the custom field values are not coming through, and the LIKP table is not updated.
    Correct Answer: Re: Update and transfer data via BADI LE_SHP_TAB_CUST_HEAD
    Abhisek Biswas, Jan 21, 2009 7:28 AM (in response to Stephen Keam)
    Hi Stephen,
    You can do it by using the PBO and PAI modules of the screen that you created. But you have to transfer the data from the subscreen to the BADI method TRANSFER_DATA_FROM_SUBSCREEN, and also from method TRANSFER_DATA_TO_SUBSCREEN to the subscreen. This will update the screen field data to LIKP.
    You can achieve this in two ways.
    1) You can use EXPORT in method TRANSFER_DATA_TO_SUBSCREEN and then IMPORT the value in the screen PBO. Likewise, you can EXPORT data from the screen PAI and IMPORT it in method TRANSFER_DATA_FROM_SUBSCREEN.
    2) Another way to do it is by using function modules and a function group instead of EXPORT/IMPORT.
    Create a function group. In the global data, define a structure/work area of type LIKP:
    DATA w_likp TYPE likp.
    Then create two function modules, one to export data and another to import data.
    Let us assume that the export FM takes IS_LIKP as input and the import FM outputs the value of LIKP into ES_LIKP.
    Then pass the value is_likp to the export FM in the BADI method TRANSFER_DATA_TO_SUBSCREEN, and in the screen PAI pass the LIKP data to the export FM.
    In the export function module write the following code:
    MOVE is_likp TO w_likp.
    Then in the import FM write the following code:
    MOVE w_likp TO es_likp.
    The import FM is called from method TRANSFER_DATA_FROM_SUBSCREEN and from the screen PBO.
    This will solve your problem.
    Regards,
    Abhisek.

  • Ü Character being replaced ? when writing the data to CSV from DB in Linux

    Hi All,
    Can anyone help me in understanding the exact cause of my issue
    Issue description: the Ü character is being replaced with ? when writing data to CSV from the DB in Linux.
    Shell used: ksh
    Actual string: MÜNCHENER RÜCKVERSICHERUNGS-GESELLSCHAFT AG
    Output String: M?NCHENER R?CKVERSICHERUNGS-GESELLSCHAFT AG
    locale output:
    LANG=en_US.UTF-8
    LC_CTYPE="en_US.UTF-8"
    LC_NUMERIC="en_US.UTF-8"
    LC_TIME="en_US.UTF-8"
    LC_COLLATE="en_US.UTF-8"
    LC_MONETARY="en_US.UTF-8"
    LC_MESSAGES="en_US.UTF-8"
    LC_PAPER="en_US.UTF-8"
    LC_NAME="en_US.UTF-8"
    LC_ADDRESS="en_US.UTF-8"
    LC_TELEPHONE="en_US.UTF-8"
    LC_MEASUREMENT="en_US.UTF-8"
    LC_IDENTIFICATION="en_US.UTF-8"
    LC_ALL=
    Environment variable set: NLS_LANG=AMERICAN_AMERICA.WE8ISO8859P1
    Linux Version: 2.6.18-128.el5 128 bit
    java Version: 1.6
    Oracle DB version: Oracle Database 10g Enterprise Edition Release 10.2.0.5.0 - 64bi
    When I query directly from the DB I get the data in the right format, but when I write the same data to a CSV file the above problem occurs.
    Can anyone please suggest me what could be the cause for this issue?
    Regards,
    Shiva

    Hi Srini,
    We are using a Java-based ETL written specifically for our application. We are using SQL*Plus to make DB connections. I am using the vi editor to view the data. After the extraction process, the processed data is also in the wrong format.
    Regards,
    Shiva
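    One likely cause, offered as an assumption based on the settings above: NLS_LANG is WE8ISO8859P1 (Latin-1) while the shell locale is en_US.UTF-8, so the data crosses a Latin-1 conversion somewhere between the DB and the file. On the Java ETL side it also helps to name the charset explicitly when writing the CSV instead of relying on the platform default; a minimal sketch (file name and row are placeholders):
        import java.io.FileOutputStream;
        import java.io.IOException;
        import java.io.OutputStreamWriter;
        import java.io.Writer;

        public class WriteCsvUtf8 {
            public static void main(String[] args) throws IOException {
                String row = "MÜNCHENER RÜCKVERSICHERUNGS-GESELLSCHAFT AG";
                // Name the charset explicitly so the output does not depend on
                // file.encoding or the shell locale.
                try (Writer out = new OutputStreamWriter(
                        new FileOutputStream("out.csv"), "UTF-8")) {
                    out.write(row);
                    out.write('\n');
                }
            }
        }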

  • Writing String Data into a table

    Hi,
    I am facing a problem writing data into a table. Below is its description:
    1. I have created code which reads the URLs from an OPC server.
    2. I need to display them in a table. Since the number of URLs is known, I have defined the rows of the table.
    3. But I am not able to display those URLs in the table.
    This problem can be simply put as "how do I write the string datatype to the table?". Can a property node be used, and if yes, which property?
    I am still a beginner in this... so I would like help from all the fellow LabVIEWers out there.
    Thanks & Regards,
    Sushruth.

    KEMLab wrote:
    Hi,
    I am facing a problem writing data into a table. Below is its description:
    1. I have created code which reads the URLs from an OPC server.
    2. I need to display them in a table. Since the number of URLs is known, I have defined the rows of the table.
    3. But I am not able to display those URLs in the table.
    This problem can be simply put as "how do I write the string datatype to the table?". Can a property node be used, and if yes, which property?
    I assume you get the URLs as strings and read them in a loop.
    Let the output autoindex on the loop output, then right-click and create an indicator.
    That'll give you an array of strings that will fill and show once the loop has completed.
    /Y

  • Reading/Writing Bulk Data, getting OutOfMemoryError: Java heap space

    Hello,
    I am reading more than a million records using CursoredStream; the following is my code snippet for reading the data:
    ReadAllQuery query = new ReadAllQuery();
    query.setReferenceClass(DataBk.class);
    query.useCursoredStream();
    CursoredStream stream;
    stream = (CursoredStream) dbses.executeQuery(query);
    while (!stream.atEnd()) {
        vector1 = stream.read(stream.size());
    }
    The problem is I am getting
    Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
    I was reading other forum posts about this, and it is mentioned that the memory heap size could be increased. Where do I increase the memory heap size?
    java.lang.OutOfMemoryError while opening 9.0.4.5 mwp in 10.1.3.3
    Besides, if my above code is not alright, could someone suggest the best approach for reading and writing bulk data?
    Regards
    Edited by: user20090209 on Oct 14, 2009 10:17 AM

    Hello,
    If you are reading a million records and running out of heap, you can still get the exception when reading with a cursor one at a time, if every object read is retained. It also depends on how the driver is processing the million records. Using a scrollable cursor might help ( http://wiki.eclipse.org/Using_Advanced_Query_API_(ELUG) ), increasing the heap as suggested will too (in JDev, Java options can be added to the project properties -> Run/Debug/Profile -> Launch Settings), or even changing your app design to use objects more efficiently all around. It depends on what you are using the objects for, but even using pagination (via setting the firstRows/maxResults on a query) to limit the results returned can help, depending on how the application requires the objects. What is important is that the objects be free for garbage collection when you are done processing them, and before the next batch needs to be brought in from the database.
    Also, a common problem is that reading in one object can bring in whole trees of related objects if relationships aren't managed efficiently. Only use relationships where required, and use indirection wherever possible.
    Best Regards,
    chris
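    To make the batching idea concrete, here is a rough sketch of chunked cursor reading, assuming CursoredStream's read(), releasePrevious(), and close() methods as described in the linked ELUG page; dbses and DataBk are from the original snippet, and process() is a placeholder for the application's own handling. The heap itself is raised with a JVM option such as -Xmx1024m on the java command line.
        ReadAllQuery query = new ReadAllQuery();
        query.setReferenceClass(DataBk.class);
        query.useCursoredStream();
        CursoredStream stream = (CursoredStream) dbses.executeQuery(query);
        try {
            int count = 0;
            while (!stream.atEnd()) {
                Object row = stream.read();    // next object from the cursor
                process(row);                  // placeholder: handle the row, then let it go
                if (++count % 500 == 0) {
                    stream.releasePrevious();  // free already-processed objects for GC
                }
            }
        } finally {
            stream.close();
        }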

  • Nested IPE (In Place Element) usage when accessing Cluster/Array data via DVR

    I am sharing data across several VIs and loops via a DVR, and accessing the data via a DVR IPE. The data is a cluster of arrays. The diagram below (VI attached) illustrates the structures involved, but not the structure of the application.
    (The diagram above does not include initialization of the arrays, as it is intended only to illustrate the Cluster1 data type. Array lengths could be 100.)
    The DVR (DVR1) is passed to multiple VIs of the application at startup.
    Each VI executes loops that either read or write particular elements of each array (fArray1 or fArray2).
    I believe the DVR IPE (B1-DVR) provides blocking so that only one task can modify the data (Cluster1) at any time.
    Case 1 illustrates how I currently WRITE to array elements. The outer IPE (block B1) is rolled into a VI (not shown) that takes DVR1, Index, and Value as inputs.
    Cases 2 - 4 illustrate 3 additional methods that remove one or both of the inner IPEs (B2-Cluster and B3-Array).
    Case 2: IPE B3 (Array Index/Replace Elements) is replaced with a non-IPE 'Replace Array Subset'.
    Case 3: IPE B2 (Unbundle/Bundle Elements) is replaced with a non-IPE cluster 'Unbundle'/'Bundle'.
    Case 4: removes both B2 and B3.
    I implemented case 1 a long time ago. When I had to do the same thing again recently, I did case 4. When I stumbled across my earlier implementation, I was a bit surprised.
    Which of the 4 cases should take the least time (or resources) to execute? I think case 4 has as few array allocations as any of the other 3.
    The attached image did not capture the Buffer Allocation marks, so I marked the ones that differed with a red "B".
    I am only interested in differences in how the arrays are handled, so I see no significant differences.
    Is this one of those cases where LV doesn't need my help?
    Incidentally, I recently wrote a small app with shared data and decided to try FGVs to share array data. For small arrays, 10^7 iterations, and an FGV-based array-element read followed by an element write, the FGV was faster: 1.2 us per read/write for the FGV vs 3 us per r/w for a DVR/IPE-based read/write (like above).
    Peter
    LV 2011 SP1, Windows 7 64-Bit
    Attachments:
    IPE.vi ‏9 KB

    Option 1 is a definite no, and as far as I know it has been NI's explicit intention to steer clear of it. I believe there's an idea in the Idea Exchange which asks for this.
    I agree that option 2 makes sense, but I don't think it should be something the user specifies. Either LV can detect it automatically or it can't, but I doubt NI would let you have an option which creates the possibility for this kind of bug.
    I'm not sure, but the 'mark as modifier' option on the IPES might be the option you're looking for. I know that it exists and I know very roughly what it does, but the documentation for it is very limited and I never actually played around with it, as usually I don't need these kinds of optimizations.
    You may well be right that a new option on the IPES is desirable and you should probably add it to the idea exchange.
    As for NIWeek, I'm not going this year, so I have no idea what kinds of sessions are around, but it's a great place to find people who know what they're talking about and ask them about it directly. Certain people in LV R&D would probably be ideal for this, and if you ask relevant people, you might even get their names. I'm sure buying them a beer would also help to loosen their tongues. If you ask me, this type of interaction is the main value of the conference, not the sessions themselves.
    Try to take over the world!
