Model size limit

Hi,
Is there any size limit for models to be deployed? I am currently working with BI sources and the Flex 2 compiler in VC 7.00 SP15. Compilation of one of the models is successful, but deployment hangs at 85%. Is there any space limitation for models? Where can I check the deployment space consumed? What could be the reason for this?
Thanks
Prashant

Hi, I got a server log file containing the following entries:
"Didn't find value to DLGFIND"
"Didn't find value to HISTPORTALURL"
"Didn't find value to HISTLOGINSERVER"
Below is the relevant part of the log file. What does it mean? Does anyone have an idea?
<!LOGHEADER[START]/>
<!HELP[Manual modification of the header may cause parsing problem!]/>
<!LOGGINGVERSION[1.5.3.7185 - 630]/>
<!NAME[./log/vcserver/vc.log]/>
<!PATTERN[vc.log]/>
<!FORMATTER[com.sap.tc.logging.ListFormatter]/>
<!ENCODING[UTF8]/>
<!FILESET[0, 10, 10485760]/>
<!PREVIOUSFILE[vc.9.log]/>
<!NEXTFILE[vc.1.log]/>
<!LOGHEADER[END]/>
#1.5 #001A4B064A70007800000041000015F600045100F253AE36#1214964414262#/System/Server/VCServer#sap.com/VisualComposerServerEar04#com.sap.portal.vc.server#ANUJ#139##n/a##8878f02047db11ddcdf1001a4b064a70#SAPEngine_Application_Thread[impl:3]_26##0#0#Debug#1#com.sap.portal.vc.server#Plain###VC04 Server: studio.ini didn't find value to DLGFIND#
#1.5 #001A4B064A70007800000043000015F600045100F253B17D#1214964414262#/System/Server/VCServer#sap.com/VisualComposerServerEar04#com.sap.portal.vc.server#ANUJ#139##n/a##8878f02047db11ddcdf1001a4b064a70#SAPEngine_Application_Thread[impl:3]_26##0#0#Debug#1#com.sap.portal.vc.server#Plain###VC04 Server: studio.ini didn't find value to HISTPORTALURL#
#1.5 #001A4B064A70007800000045000015F600045100F253B2D5#1214964414263#/System/Server/VCServer#sap.com/VisualComposerServerEar04#com.sap.portal.vc.server#ANUJ#139##n/a##8878f02047db11ddcdf1001a4b064a70#SAPEngine_Application_Thread[impl:3]_26##0#0#Debug#1#com.sap.portal.vc.server#Plain###VC04 Server: studio.ini didn't find value to HISTLOGINSERVER#
#1.5 #001A4B064A70007800000047000015F600045100F253B412#1214964414263#/System/Server/VCServer#sap.com/VisualComposerServerEar04#com.sap.portal.vc.server#ANUJ#139##n/a##8878f02047db11ddcdf1001a4b064a70#SAPEngine_Application_Thread[impl:3]_26##0#0#Debug#1#com.sap.portal.vc.server#Plain###VC04 Server: studio.ini didn't find value to HISTFILESEARCH#
#1.5 #001A4B064A70007800000049000015F600045100F253B563#1214964414263#/System/Server/VCServer#sap.com/VisualComposerServerEar04#com.sap.portal.vc.server#ANUJ#139##n/a##8878f02047db11ddcdf1001a4b064a70#SAPEngine_Application_Thread[impl:3]_26##0#0#Debug#1#com.sap.portal.vc.server#Plain###VC04 Server: studio.ini didn't find value to STBLASTSESSION#
#1.5 #001A4B064A7000780000004B000015F600045100F253B69C#1214964414264#/System/Server/VCServer#sap.com/VisualComposerServerEar04#com.sap.portal.vc.server#ANUJ#139##n/a##8878f02047db11ddcdf1001a4b064a70#SAPEngine_Application_Thread[impl:3]_26##0#0#Debug#1#com.sap.portal.vc.server#Plain###VC04 Server: studio.ini didn't find value to STBLOGINDEV#
#1.5 #001A4B064A7000780000004D000015F600045100F253B7C6#1214964414264#/System/Server/VCServer#sap.com/VisualComposerServerEar04#com.sap.portal.vc.server#ANUJ#139##n/a##8878f02047db11ddcdf1001a4b064a70#SAPEngine_Application_Thread[impl:3]_26##0#0#Debug#1#com.sap.portal.vc.server#Plain###VC04 Server: studio.ini didn't find value to STBLOGINPWD#
#1.5 #001A4B064A70006F0000002E000015F600045100F2774AC7#1214964416595#/System/Server/VCServer#sap.com/VisualComposerServerEar04#com.sap.portal.vc.server#ANUJ#139##n/a##8c2640b047db11dd8624001a4b064a70#SAPEngine_Application_Thread[impl:3]_29##0#0#Debug#1#com.sap.portal.vc.server#Plain###VC04 Server: LoginSession path is Public/PerformanceManagementDashboard/Mgmt_cockpit_final.mdl readOnly is Public/PerformanceManagementDashboard/Mgmt_cockpit_final.mdl#

Similar Messages

  • Is there a size limit to internal drive?

    For some reason I remember there being a size limit for internal drives on the Power Mac, dual 2.0 GHz G5. I have one 250 GB drive installed and want to add another 500 GB drive. Will the drive exceed the size limit?
    If the size is OK, does anyone have a recommendation for a reliable, reasonably fast 500 GB drive?
    Thanks.

    I'm so glad you asked about this. I just bought a G5/1.6 GHz that has an 80 GB drive, and I wanted to install a 500 GB second drive, but the manual said the maximum total between the two drives couldn't be more than 500 GB.
    Do we know people that have done/are doing this? I use my G5 for audio recording, which is why I wanted a large second drive. Since the 80 GB drive was already in the G5, I thought I could put nothing larger than a 400 GB drive in the second slot without going over the 500 GB limit I read about in the manual.
    Excellent news if it really is the case that there's no limit to the size of the additional hard disk.
    Since we're on this topic (I realize this is probably covered elsewhere), does the same reasoning apply to the 4 GB memory limit that Apple states in the owner's manual for the G5 1.6 GHz model?
    It's got four DIMM slots for a maximum of 4 GB using four 1 GB DIMMs.
    Is it possible to use four 2 GB DIMMs and get 8 GB of memory running in these earliest G5s?
    The other two G5 models released alongside the 1.6 GHz were the 1.8 GHz and the dual 2.0 GHz; they have eight DIMM slots and are capable of using 8 GB.
    So I was wondering whether this, too, was because Apple didn't have access to 2 GB DIMM memory modules back in those days (2003) that would have allowed the G5/1.6 GHz models to run with 8 GB of RAM installed in four slots.
    Thanks guys, really glad I found this post before I bought a smaller hard disk because of what I read in the manual.
    Best,
    John

  • Quicksilver scsi drive size limit ?

    Hello all.
    The original 37 GB SCSI drive on my Quicksilver (NOT mirror door) is beginning to sing a sad song, so I am looking into replacement.
    I know there is a 128 GB size limit for ATA drives in this model, but is there any size limit for internal SCSI?
    I have already added a 120 GB ATA drive alongside my original 80 GB ATA drive, so I think I'm pretty well maxed out (heat- and power-wise) on internal ATA drives.
    Due to the cost of SCSI drives, I'm thinking what I may do is buy a LARGE external FireWire drive, transfer the contents of my internal SCSI drive to that FireWire drive, and then just use the SCSI drive as a dedicated "video capture" drive until it finally dies.
    So, besides the original question about SCSI drive size limits, does anyone have any thoughts on going external FireWire vs. additional internal drives?
    Thanks.
    Gary
    933 MHz PowerPC G4, Mac OS X (10.2.x)

    Hello! If speed is an issue, get a SATA Raptor drive (most are 36 or 80 GB). The access time is as fast as a 10,000 rpm SCSI drive. Just add a SATA card to use the SATA drive. Tom

  • 3d files/ size limit in CS5?

    Does anyone know if there's a 3D file quantity/size limit within a Photoshop document? What would any limit be dependent on, e.g., VRAM?
    Running Photoshop CS5 Extended 64-bit, all updates, on a dual Xeon, 12 GB RAM, 64-bit Win 7, NVIDIA Quadro FX 3800 (1 GB), with a Raptor used as a dedicated scratch disk with 50 GB of space. PS is set to allocate over 9 GB of RAM, GPU OpenGL is enabled and set to Normal, 3D settings allocate 100% of VRAM (990 MB), and rendering is set to OpenGL. You'd expect this to perform admirably and handle most tasks.
    Background:
    I'm creating a PSD website design file with 3 x 3D files embedded, one 'video' animation file linked, a few smart objects (photos), and the rest shapes and text with a few masks, etc. Nothing unusual other than maybe the video and 3D files. The file size is 500 MB, which isn't unusual, as I've worked on several 800 MB files open at the same time in the same workspace; the PC handles that without any problems.
    Introducing the 3D files and video seems to have hit an error or a limit of some sort, but I can't seem to pinpoint what's causing it or how to resolve it.
    Problem:
    With the one 500 MB file I've been working on open, I try to open any ONE file, or create a new one, and I get the following error: "Could not complete the command because too many files were selected for opening at once". I've tried with 3 files, other PSD files, JPEGs, anything that can be opened in PS, all with the same message, with only one PSD file open and only trying to open one more file or create a new file from scratch.
    I've also had a similar error: "Could not complete your request because there are too many files open. Try closing some windows & try again". I have re-booted, opened only PS, and still get the same errors.
    I tried removing the video file and saving a copy; that doesn't work. I removed some of the 3D files and saved a copy, and then it sometimes allows me to open more files. I tried leaving the 3D files in and reducing lighting (no textures anyway) and rendering without ray tracing, still no effect. I tried rasterising the files, and that allowed more files to be opened. I'm working across a network, so I tried using local files, which made no difference. The only thing that seems to make a difference is removing or rasterising some of the 3D files.
    Has anyone had similar problems with what seems to be a limit either on the quantity of 3D files, or maybe a complexity limit, or something else to do with 3D file limits? Does anyone know of upgrades that might help? I've checked free RAM and that's at 7 GB, using about a 10 GB swap file. I've opened 5 documents of over 700 MB each at the same time without problems, so I can only think the limit is with the GPU with regard to 3D. I can't get that any higher than 990 MB, which I'd assume would be enough anyway if that were the problem. I've played about with preferences to lower the 3D settings, but no use.
    Does anyone have any idea what's limiting it and causing it to give the error message above? Is it even a PS5 limit, or a Win 7 64-bit limit?

    Thanks for your comments Mylenium. I originally thought it might be VRAM, but at 1 GB (still quite an acceptable size from what I can tell; I'd expect more than 3 x 3D files for that) I originally dismissed it, as the complexity of the files seemed quite low for that to be the cause. I'm still not completely convinced it's the VRAM, though, because of the error message it gives, and I have tried more complex 3D models, and more of them, and it works fine with those. It seems odd that it won't let me create a new document either. I would like to get a 6 GB card, but that's a bit out of the budget range at the moment.
    Do you know of a way to "optimise" 3D files so they take up less VRAM, for example reducing any unwanted textures, materials, vertices or faces within PS, in a similar fashion to how Illustrator can reduce the complexity/number of shapes/points etc.? I can't ask the client, as they don't have the time, or I'd do this. Does rendering quality make a difference, or changing to a smart object? It doesn't seem to, from what I've tried.
    Re: using a dedicated 3D program, I'd be a bit reluctant to lose the ability to rotate/edit/draw onto/light objects within Photoshop now that I have a taste for it, and go back to just using 3D renderings; otherwise I'd go down the route suggested for a dedicated 3D package. Thanks for the suggestion though.

  • "Convert Text to Table" Size limit issue?

    Alphabetize a List
    I’ve been using this well known work around for years.
    Select your list and in the Menu bar click Format>Table>Convert Text to Table
    Select one of the column’s cells (1st click selects entire table, 2nd click selects individual cell)
    Open “Table Inspector” (Click Table icon at top of Pages document)
    Make sure “table” button is selected, not “format” button
    Choose Sort Ascending from the Edit Rows & Columns pop-up menu
    Finally, click Format>Table>Convert Table to Text.
    A few days ago I added items and my list was 999 items long, ~22 pages.
    Tonight I added 4 more items. Still the same number of pages, but now 1,003 items long.
    Unable to Convert Text to Table! I tried for 45 minutes. I think there is a list-length limit, perhaps 999 items?
    I tried closing the document without saving any changes, re-opening Pages, and re-adding my new items to the end of the list as always, and once again, when I highlight the list and click Format>Table>Convert Text to Table, nothing happens! If I highlight part of the list, up to 999 items, and leave the 4 new items unhighlighted, it works. I pasted the list into a new doc, copied a few items from the middle of the list, and added them to the end of my new 999-item list to make it 1,003 items long (but different items), and it did NOT work. I even attempted to add a single new item, making the list an even 1,000 items long, and nope, not working. I even restarted the iMac; no luck.
    I can get it to work with 999 or fewer items easily, as always, but no way when I add even a single new item.
    Has anyone else had this problem? It should be easy to test: if you have a list of, say, 100 items, just copy and repeatedly paste it into a new document multiple times to get over 1,000, then see if you can select all and convert it from text to table.
    Thanks!
    Pages 08 v 3.03
    OS 10.6.8

    G,
    Yes, Pages has a table size limit, as you have discovered. Numbers has a much greater capacity for table length, so if you do your sort in Numbers you won't have any practical limitation.
    A better approach than switching to Numbers for the sort would be to download, install and activate Devon Wordservice. Then you could sort your list without converting it to a table.
    Jerry

  • Connection pool size limit error

    Hi all,
    I am trying to execute a BAPI function from MII, execution fails with the following message;
    [ERROR] Unable to make RFC call Exception: [Problem retrieving JCO.Function object: Connection pool <ECC_Server>:800:02:EN:ECCUser is exhausted. The current pool size limit (max connections) is 1 connections.]
    [WARN] [SAP_JCo_Function_0] Skipping execution of output links due to action failure.
    [ERROR] Uncaught exception from SAP_JCo_Function_0, Problem retrieving JCO.Function object: Connection pool <ECC_Server>:800:02:EN:ECCUser is exhausted. The current pool size limit (max connections) is 1 connections.
    Config:
    1. In 'SAP MII: Connections', a connection of type JCO is defined, and I have set the pool size to 100.
    2. In 'SAP MII: Credential Stores', a store is created, and the same store is being used in Start Session.
    3. In the JCO_Function block, we can search for the Function Module and set it.
    MII Version:
    14.0.2 Build(82)
    Am I missing something?
    Has anyone seen this? Please advise.
    Thanks,
    Message was edited by: Shridhar N

    Check if there is another JCo connection configured with the same IP and user. I have found in the past that even though two connections are configured, because they have the same IP and user they are put into one pool, with the lower max pool size of the two connections.
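
    For context on what that pool limit controls, the fragment below shows the equivalent settings in a standalone SAP JCo 3 destination file. This is a hypothetical illustration only (the file name, host, and values are made up, and it is an assumption that MII's pool-size field maps onto settings like these); it is not MII's own configuration format:

    ```properties
    # Hypothetical standalone JCo 3 destination file: ECC_Server.jcoDestination
    # Illustration only -- MII manages its JCo connections through its own UI.
    # Assumed application server host:
    jco.client.ashost=ecc-host.example.com
    jco.client.sysnr=02
    jco.client.client=800
    jco.client.user=ECCUser
    jco.client.passwd=secret
    jco.client.lang=EN
    # Idle connections kept open in the pool:
    jco.destination.pool_capacity=10
    # Hard ceiling on simultaneous connections; when it is reached and no
    # connection frees up in time, JCo reports the pool as exhausted:
    jco.destination.peak_limit=20
    ```

    The error text "pool size limit (max connections) is 1" suggests the pool actually in effect has a ceiling of 1, which fits the reply above: two configured connections with the same host and user collapsing into one pool with the lower limit.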

  • Is there a way to put more apps (increase the folder size limit) in iOS 5.0.1 on iPad2

    I read on some site that there is an app that allows one to increase the desktop folder size limit in iOS 5.0.1 on an iPad 2. Does anyone have any info, please?
    Thanks for any thoughts in advance.....
    Dave

    Firstly, thanks for taking the time to reply :-)
    It's not that I want gazillions of apps; I just wanted to put more in each folder so I don't have to have multiple folders with similar names (Weather1, 2, 3, etc.), but I see your point and appreciate the thoughts.
    Dave

  • How do I change the attachment size limit in Calendar Server 6.3, UWC, IWC?

    How do I properly increase or decrease the attachment size limit with Calendar Server and all supported user interfaces to it, such as WCAP, UWC (Communications Express) and IWC (Convergence)?
    From my experience with the Outlook Connector, there seems to be some limit imposed by cshttpd on the size of a file upload (I believe I actually got an HTTP error code back on the WCAP request indicating something was too big; sorry, I don't have it handy, I'll have to re-test). Additionally, it seems UWC imposes additional limits (example: http://docs.sun.com/app/docs/doc/819-4440/6n6jfgcjh?l=en&a=view&q=fileSizeHardLimit), but I can't seem to get those to work at all. I found many different web.xml files related to UWC and I'm not sure which one to change. I tried a couple but had no success, because UWC would always report this error if I uploaded between 4 and 5 MB: com.iplanet.jato.util.WrapperRuntimeException
    Root cause = [java.io.IOException: Request cancelled because file input field
    "importFile" size is over the configurable limit of 4194304 bytes; see filter init
    parameter fileSizeHardLimit]
    And it would complain about requestSizeLimit I think if it was over 5 megs, claiming that limit was 5242880. IWC gives a generic error when the upload is too big and rejects it.
    I fear that a 4 MB limit will be too imposing and of limited value, so I would either like to raise it, or consider lowering it to 0 bytes so attachments cannot be used at all. I have been looking high and low for information on how to do this, and all I can find is the UWC examples. I plan to support the Outlook Connector, UWC, and IWC, so the limits should ideally be the same across each. Some of the Exchange data we wish to import does have attachments, so it would be good to continue support for that. I did see some other posts about quota RFEs, but at this point I am not concerned about disk consumption. Can anyone help? Thanks. Please let me know if there is any more information I can provide. I am running SCS6u1 on Solaris 10 SPARC.

    Fred@egr wrote:
    Thanks!!! This is working with IWC and I am pretty sure it will work with Outlook. I didn't think to look at config options for mshttpd since I don't have it installed, and ics.conf doesn't list http.service.maxmessagesize and service.http.maxpostsize by default.
    http.service.maxmessagesize is only relevant to mshttpd, not cshttpd. service.http.maxpostsize applies to both.
    UWC is still limiting me though; I'm sure I can reconfigure UWC if I just know which file to edit and whether I need to redeploy anything. I'm using the same install paths as the SCS6 Single Host example and I'm not sure what the "uwc-deployed-path" is supposed to be. Again, thanks.
    If you have deployed UWC/CE to Application Server, you would edit the following file and restart the application server:
    /opt/SUNWappserver/domains/domain1/generated/xml/j2ee-modules/Communications_Express/web.xml
    e.g.
      <filter>
        <filter-name>MultipartFormServletFilter</filter-name>
        <filter-class>com.sun.uwc.calclient.MultipartFormServletFilter</filter-class>
        <init-param>
          <param-name>fileSizeHardLimit</param-name>
          <param-value>15485760</param-value>
        </init-param>
        <init-param>
          <param-name>requestSizeLimit</param-name>
          <param-value>15485760</param-value>
        </init-param>
        <init-param>
          <param-name>fileSizeLimit</param-name>
          <param-value>15485760</param-value>
        </init-param>
      </filter>
    Regards,
    Shane.

  • LabView RT FTP file size limit

    I have created a few very large AVI video clips on my PXIe-8135RT (LabVIEW RT 2014). When I try to download these from the controller's drive to a host laptop (Windows 7) with FileZilla, the transfer stops at 1 GB (the file size is actually 10 GB).
    What's going on? The file appears to be created correctly, and I can even use AVI2 Open and AVI2 Get Info to see that the video file contains the frames I stored. Reading up about LVRT, there is nothing but older information claiming the file size limit is 4 GB, yet the file was created at 10 GB using the AVI2 VIs.
    Thanks,
    Robert

    As usual, the answer was staring me right in the face. FileZilla was reporting the size in an odd manner, and the file was actually 1 GB. The VI I used was failing. After fixing it, it failed at 2 GB with error -1074395965 (AVI max file size reached).

  • [AS3 AIR] 2880x2880 Size Limit, Camera, Filter, CameraRoll Issues & Finger Friendly Components

    AS3 AIR ANDROID
    I started playing with Adobe AIR for Android by building an app that lets the user take a picture using the mobile device's native camera app. Then I apply filter effects to make the image look cool, and finally I allow the user to save the image back to the camera roll.
    Here are some questions that I have:
    KEEPING UP WITH CURRENT TECHNOLOGY
    Are we limited to the 2880x2880 stage size limit? Although this dimension does yield 8+ megapixels, it's not the ratio that most camera sensors are built around (widescreen). Plus, you can bet that newer cameras will have even higher dimensions. Will this be updated to keep up with current technology requirements?
    IMPORTING & MANIPULATING CAMERA DATA
    Code
    var bmpData:BitmapData = new BitmapData($loader.width, $loader.height);
    bmpData.draw(DisplayObject($loader));
    bmp = new Bitmap(bmpData);
    bmp.width = Capabilities.screenResolutionX;
    bmp.height = Capabilities.screenResolutionY;
    if (CameraRoll.supportsAddBitmapData) {
        var cameraRoll:CameraRoll = new CameraRoll();              
        cameraRoll.addEventListener(ErrorEvent.ERROR, onCrError);
        cameraRoll.addEventListener(Event.COMPLETE, onCrComplete);
        var savedBmpData:BitmapData = new BitmapData (bmp.width, bmp.height);
        savedBmpData.draw(DisplayObject(bmp));
        cameraRoll.addBitmapData(savedBmpData);
    } else {
        trace("~" + "Camera Roll not supported for this device.");
    }
    addChild(bmp);
    When you capture an image using the mobile device's camera app, you have to use the Loader object.
    So, here, I am doing just that with these steps:
    First, I'm creating a BitmapData object and sizing it the same as the camera image.
    Pass the camera image into the BitmapData object.
    Create a Bitmap object and pass in the BitmapData.
    Resize it to fit on the stage.
    Check for Camera Roll and then create a savedBmpData BitmapData object at the size of the screen.
    Pass in the bmp.
    Save it to the Camera Roll.
    The problem is that when the image is displayed on the phone, it shows THE ENTIRE (uncropped) image. However, the image that is saved to the phone is only the top 800x480 corner of the image. What is wrong? How do we save an image to the Camera Roll that is larger than the display area of the phone? It seems like the only way to save the entire camera image is to resize it to fit the stage and then capture the stage into a BitmapData object.
    FILTERS
    If you apply any filters to the bitmapData object, the filter effects will display on the phone, but if the image is saved to the Camera Roll, then all the filters are lost and it only saves the original (unfiltered) image.
    FINGER FRIENDLY UI COMPONENTS
    Do they exist?
    ADDITIONAL NOTES
    The max image size that can be saved is 2039x2039 pixels on the HTC Evo. Anything bigger than this resulted in CameraRoll Error #1 (for which there is no documentation ANYWHERE on the web).

  • HT4863 How can I increase the file size limit for outgoing mail. I need to send a file that is 50MB?


    You can't change it, and I suspect few email providers would allow a file that big.  Consider uploading it to a service like Dropbox, then email the link allowing the recipient to download it.

  • FILE and FTP Adapter file size limit

    Hi,
    Oracle SOA Suite ESB related:
    I see that there is a file size limit of 7 MB for transfers using the File and FTP adapters, and that debatching can be used to overcome this issue. I also see that debatching can be done only for structured files.
    1) What can be done to transfer unstructured files larger than 7 MB from one server to the other using the FTP adapter?
    2) For structured files, could someone help me in debatching a file with the following structure.
    000|SEC-US-MF|1234|POPOC|679
    100|PO_226312|1234|7130667
    200|PO_226312|1234|Line_id_1
    300|Line_id_1|1234|Location_ID_1
    400|Location_ID_1|1234|Dist_ID_1
    100|PO_226355|1234|7136890
    200|PO_226355|1234|Line_id_2
    300|Line_id_2|1234|Location_ID_2
    400|Location_ID_2|1234|Dist_ID_2
    100|PO_226355|1234|7136890
    200|PO_226355|1234|Line_id_N
    300|Line_id_N|1234|Location_ID_N
    400|Location_ID_N|1234|Dist_ID_N
    999|SSS|1234|88|158
    I would need the complete data in a single file at the destination for each file in the source. If there are as many files as the number of batches at the destination, I would need the output file structure to be as follows:
    000|SEC-US-MF|1234|POPOC|679
    100|PO_226312|1234|7130667
    200|PO_226312|1234|Line_id_1
    300|Line_id_1|1234|Location_ID_1
    400|Location_ID_1|1234|Dist_ID_1
    999|SSS|1234|88|158
    Thanks in advance,
    RV
    Edited by: user10236075 on May 25, 2009 4:12 PM
    Edited by: user10236075 on May 25, 2009 4:14 PM

    OK, here are the steps:
    1. Create an inbound file adapter as you normally would. The schema is opaque; set the polling as required.
    2. Create an outbound file adapter as you normally would; it doesn't really matter what XSD you use, as you will modify the WSDL manually.
    3. Create an XSD that will read your file. This would typically be the XSD you would use for the inbound adapter. I call this address-csv.xsd.
    4. Create an XSD that is the desired output. This would typically be the XSD you would use for the outbound adapter. I have called this address-fixed-length.xsd. So I want to map CSV to fixed-length format.
    5. Create the XSLT that will map between the 2 XSDs. Do this in JDev: select the BPEL project, right-click -> New -> General -> XSL Map.
    6. Edit the outbound file partner link WSDL and set the JCA operations as the doc specifies; this is my example.
    <jca:binding  />
            <operation name="MoveWithXlate">
          <jca:operation
              InteractionSpec="oracle.tip.adapter.file.outbound.FileIoInteractionSpec"
              SourcePhysicalDirectory="foo1"
              SourceFileName="bar1"
              TargetPhysicalDirectory="C:\JDevOOW\jdev\FileIoOperationApps\MoveHugeFileWithXlate\out"
              TargetFileName="purchase_fixed.txt"
              SourceSchema="address-csv.xsd" 
              SourceSchemaRoot ="Root-Element"
              SourceType="native"
              TargetSchema="address-fixedLength.xsd" 
              TargetSchemaRoot ="Root-Element"
              TargetType="native"
              Xsl="addr1Toaddr2.xsl"
              Type="MOVE">
          </jca:operation>
    7. Edit the outbound header to look as follows:
        <types>
            <schema attributeFormDefault="qualified" elementFormDefault="qualified"
                    targetNamespace="http://xmlns.oracle.com/pcbpel/adapter/file/"
                    xmlns="http://www.w3.org/2001/XMLSchema"
                    xmlns:FILEAPP="http://xmlns.oracle.com/pcbpel/adapter/file/">
                <element name="OutboundFileHeaderType">
                    <complexType>
                        <sequence>
                            <element name="fileName" type="string"/>
                            <element name="sourceDirectory" type="string"/>
                            <element name="sourceFileName" type="string"/>
                            <element name="targetDirectory" type="string"/>
                            <element name="targetFileName" type="string"/>                       
                        </sequence>
                    </complexType>
                </element>
            </schema>
        </types>
    8. The last trick is to have an assign between the inbound header and the outbound header partner links that copies the headers. You only need to copy sourceDirectory and sourceFileName:
        <assign name="Assign_Headers">
          <copy>
            <from variable="inboundHeader" part="inboundHeader"
                  query="/ns2:InboundFileHeaderType/ns2:fileName"/>
            <to variable="outboundHeader" part="outboundHeader"
                query="/ns2:OutboundFileHeaderType/ns2:sourceFileName"/>
          </copy>
          <copy>
            <from variable="inboundHeader" part="inboundHeader"
                  query="/ns2:InboundFileHeaderType/ns2:directory"/>
            <to variable="outboundHeader" part="outboundHeader"
                query="/ns2:OutboundFileHeaderType/ns2:sourceDirectory"/>
          </copy>
        </assign>
    You should be good to go. If you just want pass-through, then you don't need the native format: set the type to opaque, with no XSLT.
    cheers
    James

  • JVM heap size limit under Windows

    Hi,
    I'm looking either for some help with a workaround, or
    confirmation that the information I've found is still the case for the
    current state of Java.
    Development machine is Win XP Pro, 2G RAM.
    The biggest heap I can allocate is about 1.6 GB, and that is not large enough for this app.
    I have a Swing application that
    1) must run on Win XP, 32 bit
    2) must implement an editor (similar to Excel but with fewer features) to handle large CSV files (up to about 800 MB)
    3) Strong preference for Java 5, though higher could conceivably be supported.
    Research so far tells me that this is the result of process memory limitations
    of Windows and the JVM, and that I might be able to squeeze a little more heap with
    Windows' rebase command, but probably not enough and I would start running the
    risk of conflicts with other applications on my users' systems. Ugh.
    Also, I read of the Windows /3GB switch, but posts say that the available JDKs are not built to use that feature. I haven't had a chance to add memory to test that yet. However, I'm also under the impression that I should be able to allocate a heap larger than physical RAM, except for that process size limit.
    So ... my information is basically that I'm stuck with a limit of about 1.6 GB for heap size, regardless of the RAM in my computer.
    Can anyone confirm whether that is still correct, preferably with a pointer to some
    official reference ?
    Or better yet, point me toward a workaround?
    Thanks!
    -tom

    Some bookmarks I have on this topic.
    http://sinewalker.wordpress.com/2007/03/04/32-bit-windows-and-jvm-virtual-memory-limit/
    http://stackoverflow.com/questions/171205/java-maximum-memory-on-windows-xp
    The first link pulled together what I found in lots of bits and pieces elsewhere, nice to have a coherent summary :)
    The second link offered a bit of insight into the jvm that I hadn't seen yet .
    Thanks!
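
    To pin down what the JVM will actually grant on a given machine, a tiny check program helps. This is a generic sketch (the class name is mine, not part of the original app): start it with increasing `-Xmx` values and see what `Runtime.maxMemory()` reports.

    ```java
    public class MaxHeapCheck {
        public static void main(String[] args) {
            // Runtime.maxMemory() reports (approximately) the -Xmx ceiling the
            // JVM accepted. On 32-bit Windows this tops out well below 2 GB,
            // because the heap must fit in one contiguous free block of the
            // process address space, alongside DLLs and native allocations.
            long maxBytes = Runtime.getRuntime().maxMemory();
            System.out.println("Max heap: " + (maxBytes / (1024L * 1024L)) + " MB");
        }
    }
    ```

    Usage: run `java -Xmx1600m MaxHeapCheck`, raising `-Xmx` until the JVM refuses to start with "Could not reserve enough space for object heap"; that refusal point is the practical limit the links above discuss.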

  • Message size limit

    Hi all,
    I have a question regarding the message size for the mapping:
    What is the size limit for messages when using the XSLT mapping methods?
    What are the maximum message sizes (if any) for the other methods like ABAP mapping, Graphical mapping, JAVA Mapping and XSLT Mapping (JAVA) ?
    Looking foward to hear from you:)
    Regards
    Markus

    Hi Markus,
    Just go through the below thread, which talks about the same question:
    What is the maximum size of data XI can handle
    Also go through the mapping performance blog, which helps in understanding the performance side as well.
    Mapping Performance:
    /people/udo.martens/blog/2006/08/23/comparing-performance-of-mapping-programs
    Thnx
    Chirag
    Reward points if it helps.

  • S1000 Data file size limit is reached in statement

    I am new to Java and was given the task of troubleshooting a Java application that was written a few years ago and is no longer supported. The Java application creates database files in the user's directory: diwdb.properties, diwdb.data, diwdb.lproperties, diwdb.script. The purpose of the application is to open a zip file and insert the files into a table in the database.
    The values that are populated in the diwdb.properties file are as follows:
    #HSQL Database Engine
    #Wed Jan 30 08:55:05 GMT 2013
    hsqldb.script_format=0
    runtime.gc_interval=0
    sql.enforce_strict_size=false
    hsqldb.cache_size_scale=8
    readonly=false
    hsqldb.nio_data_file=true
    hsqldb.cache_scale=14
    version=1.8.0
    hsqldb.default_table_type=memory
    hsqldb.cache_file_scale=1
    hsqldb.log_size=200
    modified=yes
    hsqldb.cache_version=1.7.0
    hsqldb.original_version=1.8.0
    hsqldb.compatible_version=1.8.0
    Once the database file gets to 2 GB, it brings up the error message 'S1000 Data file size limit is reached in statement (Insert into <tablename>......
    From searching on the internet, it appeared that the parameter hsqldb.cache_file_scale needed to be increased, and 8 was a suggested value.
    I have the distribution files (.jar & .jnlp) that are used to run the application, and I have a source directory that contains Java files. But I do not see any properties files in which to set parameters. I was able to load both directories into NetBeans, but I really don't know if the files can be rebuilt for distribution, as I'm not clear on what I'm doing and NetBeans shows errors in some of the directories.
    I have also tried to add parameters to the startup URL: http://uknt117.uk.infores.com/DIW/DIW.jnlp?hsqldb.large_data=true?hsqldb.cache_file_scale=8 but that does not affect the application.
    I have been struggling with this for quite some time and would greatly appreciate any assistance to help resolve this.
    Thanks!

    Thanks! But where would I run the SQL statement? When anyone launches the application, it creates the database files in their user directory. How would I connect to the database after that to execute the statement?
    I see the CREATE TABLE statements in the files I have pulled into NetBeans, in both the source folder and the distribution folder. Could I add the statement there, before the table is created, in the jar file in the distribution folder and then re-compile it for distribution? Or would I need to add it to the file in the source directory and recompile those to create a new distribution?
    Thanks!
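
    For completeness, a hedged sketch of the change this thread is circling around. In HSQLDB 1.8.x, `hsqldb.cache_file_scale` controls the .data file ceiling (the default gives the 2 GB limit seen here; 8 raises it to 16 GB), and the 1.8 documentation describes changing it with a SQL statement. The guide also notes the property can only be changed when the database has no data in CACHED tables, so verify both the statement and the precondition against your exact version before relying on this:

    ```sql
    -- Run once via any JDBC client (e.g. HSQLDB's DatabaseManager) against
    -- the diwdb database, then shut down cleanly so the change is persisted:
    SET PROPERTY "hsqldb.cache_file_scale" 8
    SHUTDOWN
    ```

    An alternative sometimes suggested is editing `hsqldb.cache_file_scale=8` directly into diwdb.properties while the database is closed, but that is exactly the kind of detail to confirm in the 1.8 guide first.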
