Large form set: Deliver in portfolio, or build one big PDF?

Greetings, all--
I am using LiveCycle to recreate a set of 24 or so forms that were originally built in Word. My users download the forms, complete them on their desktops, then print and submit the forms in hard copy. Most users are not tech-savvy. The forms must be very easy for them to access, open, navigate, and print. Most also work within severe hardware limitations (slow Internet connections, old machines, etc.). On my end, I have a rusty-at-best command of scripting and am new to LiveCycle. I know what I want my forms to do but am very slow at figuring out how to make those things happen.
I am working first to decide on how to deliver these forms. It looks like I will have to build one big PDF to get the form behavior I want (autopopulation of like fields, field calculation, etc.). Does that sound right? Or is it possible to set up each file as a separate PDF and package them in a portfolio and still get cross-file behavior like auto field population, sequential page numbering, etc.?
Any feedback on setting up a big set of files would be much appreciated.
Thank you,
Virginia

Hi,
For what it is worth...
First of all, consider what version of Acrobat/Reader the user population will have. Improved functionality comes with each new version, so you might end up including features in your form that won't work on the users' PCs. In LC Designer 8.2 you can define the target version (in File/Form Properties/Defaults) and then check the Warnings tab to make sure that the form will run OK in that version (i.e. no warnings).
If people don't have to save the form (or the data that they have typed in), then you don't have to worry about Reader-enabling the form. Reader-enabling gives users with Reader the ability to save the (Reader-enabled) form, which is useful if they are filling in the same form regularly.
The implementation of Portfolios in version 9 is very good. However, if users have older versions of Acrobat/Reader, it will revert to the previous Packages implementation (less graphical) and users will get warning messages.
Keeping the forms separate will help performance, but may make it more difficult for users to locate the correct form. Creating one large form in LC is possible (and will make it very easy to share values across the 24 forms because they will all be in the one XFA PDF); however, if each form has multiple pages and there is dynamic hide/visible scripting, then performance may be a problem. If the form is static (i.e. does not grow), then performance will not be as badly affected by the single-form approach.
There is a workaround to get forms to talk to each other (whether in a portfolio or not), but it requires a good bit of scripting and a cool head.
In summary, if you are working with static forms that will not grow (fields not extending to accommodate overflowing text), then I would go with one form.
You can always develop the single form for the time being and then at a later stage break it out into 24 separate forms.
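To illustrate the value-sharing point: within the one XFA PDF, a field's exit event can copy its value into the like-named field in another section, so the user only types it once. This is only a minimal sketch with hypothetical subform/field names (SectionA, SectionB, ApplicantName); the SOM paths would have to match your own form hierarchy.

    // JavaScript, exit event of SectionA.ApplicantName (names are placeholders)
    if (!this.isNull) {
        // resolveNode looks the target field up by its SOM expression,
        // relative to the root subform of the single XFA PDF
        xfa.resolveNode("form1.SectionB.ApplicantName").rawValue = this.rawValue;
    }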
Good luck,
Niall

Similar Messages

  • Exporting one specific pdf form field data to a specific webpage field

    Hello there.
    I am currently creating a form from which I need to export one specific PDF form field's data to a specific web page field, to avoid typing it again or hitting Ctrl+C and then Ctrl+V into the web page, as there are several records that need to be copied and pasted.
    I read that there is no access to the clipboard from within a PDF; therefore, I would like to know if there is any way to do this without accessing the clipboard.
    I am a newbie and have been learning by searching the forums and Google, so I would appreciate any insight on whether or not this is possible using JavaScript or the "submit form" function.
    Any help is greatly appreciated!

    Hi George, thanks for your response!
    The main issue I have is that this web page is in fact a government page in which I have to manually copy and paste information.
    I have no idea how to automate this process. I have access to the scripts on the page to see what I am able to do, but since I do not have much experience (only very basic JavaScript), any other insight would be great!
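
    One clipboard-free approach worth noting (assuming the target page can accept values in its query string, which may well not be true for a government site you cannot modify) is to launch the page from the PDF with the field value appended to the URL. A minimal Acrobat JavaScript sketch, with a hypothetical field name and URL:

        // Mouse Up action of a button in the PDF (field name and URL are placeholders)
        var v = this.getField("RecordNumber").value;
        app.launchURL("https://example.gov/lookup?record=" + encodeURIComponent(v), true);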

  • How to Deal with a Large Form

    Hello all. I have created a very large form using LiveCycle, which utilizes quite a bit of scripting. Unfortunately, as more and more data is added, the form becomes slower and slower, to the point where a user spends more time waiting than actually filling out the form. This is clearly unacceptable, so I'm looking to remedy this. One thought I had was to split the form up into several sections (the format of the form allows this without trouble). I have a few concerns related to this.
    First, for the people processing the form on the receiving end, processing several forms instead of one is several times as much work and several times as much of a hassle.
    Second, it is clearly less convenient to distribute several forms than it is to distribute one, so if I were to do this I would be looking for a way to bundle them together (perhaps a PDF Portfolio?).
    Third, for ease of use I would want to find some way that a user could simply click on a link (or button or whatever) to move on to the second form after finishing the first.
    If there were some way to combine the separate parts back together after they had been filled out and submit them as one entity, that would seem to solve the first problem, but I have no idea how one might go about doing that. As I mentioned, perhaps a PDF Portfolio is the solution to my second question, but I've never worked with those and don't know if they would be suitable. Of course, if there was a way to speed up this form directly, that would solve all of these problems in one fell swoop. If anyone is willing to take a look at this form I'd be glad to e-mail it to them (I don't feel comfortable posting it in a public location at the moment).
    Thank you all very much.
    ===========================================
    Update: I just read a different post below about the "remerge" command causing a form to slow down. I used xfa.form.recalculate(1) all over the place in my form. Is it possible this is causing the slowdown?
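
    For reference, one narrower alternative to calling xfa.form.recalculate(1) everywhere is to let each calculate event read only the nodes it depends on, so a form-wide remerge/recalculate is not forced on every change. A minimal sketch, assuming a hypothetical table Table1 whose Row1 instances contain an Amount field:

        // JavaScript, calculate event of a GrandTotal field (names are placeholders)
        var rows = Table1.resolveNodes("Row1[*]");   // every instance of the row subform
        var total = 0;
        for (var i = 0; i < rows.length; i++) {
            var amt = rows.item(i).Amount.rawValue;
            total += (amt === null) ? 0 : Number(amt);   // treat empty cells as zero
        }
        this.rawValue = total;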

    Send the form to LiveCycle8@gmail.com and I will have a look when I get a chance.
    Paul

  • Newbie...large form, repeated fields, separate output

    Apologies in advance for being the newbie.  I'm great with programming in MS products, but have not done any in Adobe, and I suspect the answer to this is going to be that it'll require scripting.
    I have a need to process a large batch of mixed types of paper:
    forms filled in by hand, with address/other info
    checks (images of checks)
    forms that have been pre-printed with information
    These will all be one large continuous set of pages (will start as a video file of these getting imaged, then someone will create snapshots of the points in the video where each page is laid under a camera, then the snapshots will be combined into one pdf, which represents all documents that got processed).
    I then need someone to have the ability to do the following, as they go through the pages in Acrobat:
    - depending on the type of image they see (hand form, check images, printed form), click a shortcut, which will create a predetermined set of form fields, in a predetermined place on the screen (effectively layered onto the image).  These form fields will be set up to correlate to the type of image (a hand form will have X number of fields, which allow for address, checkboxes, etc....a check image will have Y number of fields which are address and amount...and so forth)
    - they will then do one of the following:
              have a dropdown, which is fed by some DB (I don't know if it is supposed to reside in Adobe somewhere, or can look into an Excel/Access file, or how this might be done).  They will select the ID number that the image they're processing needs to tie to (again, ID numbers are already determined in a system elsewhere...this exercise is tying form/check information to that ID number)
              fill in the appropriate fields if it is a check or a handwritten form
              have the fields for a pre-printed form (which means that the person's information is already in the system, and they printed the form before submitting it in person) populate automatically in Adobe...so there are X number of fields...in our system we have the information to fill them...based on the lookup that happens in the dropdown, the values that would populate name/address/etc. need to also come back and populate the set of fields on that page
              the user then just verifies that what's on the screen matches the form, or edits as necessary
    All fields then need to be output into a structured Excel file (this part I've done with Adobe before, so I am aware of how to do this).
    How much of the above is reasonably simple, for someone who knows scripting?  Anything sound like it's simply not possible?  I'm trying to gauge whether this idea has legs.  Thanks.
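
    On the "click a shortcut to drop a predetermined set of fields onto the page" part: if the work is done in Acrobat itself (AcroForm JavaScript rather than LiveCycle), fields can be added programmatically with Doc.addField. This is a rough sketch only; the field names, types, and coordinates below are placeholders you would replace with your own layout:

        // Document- or folder-level Acrobat JavaScript (hypothetical names/coordinates)
        function addCheckFields(doc, pageNum) {
            // addField(name, type, page, [upper-left x, upper-left y, lower-right x, lower-right y])
            doc.addField("chk.Payor",  "text", pageNum, [ 72, 720, 300, 700]);
            doc.addField("chk.Amount", "text", pageNum, [320, 720, 440, 700]);
            doc.addField("chk.ID",     "combobox", pageNum, [460, 720, 560, 700]);
        }
        // e.g. run against the page currently being viewed:
        // addCheckFields(this, this.pageNum);

    The dropdown (the "chk.ID" combobox here) could then be filled from a lookup via its setItems method, but wiring it to an external database would indeed take additional scripting.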

    Thanks Jimmy, I read through a few other posts on the forum about vaguely related results and found the problem.
    It seems that when I had copied and pasted a bound subform containing data and then changed the binding on the outer subform, the actual binding on the controls hadn't changed (despite appearing to have done so in Designer). When I went into the XML source and manually changed these, the problem seems to have cleared up.
    But thanks anyway :)

  • SQL Adapter Crashes with large XML set returned by SQL stored procedure

    Hello everyone. I'm running BizTalk Server 2009 32 bit on Windows Server 2008 R2 with 8 GB of memory.
    I have a Receive Port with the Transport Type being SQL and the Receive Pipeline being XML Receive.
    I have a Send Port which processes the XML from this Receive Port and creates an HIPAA 834 file.
    Once a large file is created (approximately 1.6 GB in XML format, 32 MB in EDI form), a second file of 1.7 GB fails to create.
    I get the following error in the Event Viewer:
    Event Type: Warning
    Event Source: BizTalk Server 2009
    Event Category: (1)
    Event ID: 5740
    Date:  10/28/2014
    Time:  7:15:31 PM
    User:  N/A
    The adapter "SQL" raised an error message. Details "HRESULT="0x80004005" Description="Unspecified error"
    Is there a way to change some BizTalk server settings to help in the processing of this large XML set without the SQL adapter crashing?
    Paul

    Could you check SQL Profiler to trace or determine whether you are facing a deadlock?
    Is your adapter running under 64-bit?
    Have you studied the possibility of using the SqlBulkInsert adapter?
    http://blogs.objectsharp.com/post/2005/10/23/Processing-a-Large-Flat-File-Message-with-BizTalk-and-the-SqlBulkInsert-Adapter.aspx

  • How to handle large data sets?

    Hello All,
    I am working on an editable form document. It is using a flowing subform with a table. The table may contain up to 50k rows, and the generated PDF may take up to 2-4 GB of memory; in some cases Adobe Reader fails and "gives up" opening these large data sets.
    Any suggestions? 

    On 25.04.2012 01:10, Alan McMorran wrote:
    > How large are you talking about? I've found QVTo scales pretty well as
    > the dataset size increases but we're using at most maybe 3-4 million
    > objects as the input and maybe 1-2 million on the output. They can be
    > pretty complex models though so we're seeing 8GB heap spaces in some
    > cases to accommodate the full transformation process.
    Ok, that is good to know. We will be working in roughly the same order
    of magnitude. The final application will run on a well equipped server,
    unfortunately my development machine is not as powerful so I can't
    really test that.
    > The big challenges we've had to overcome is that our model is
    > essentially flat with no containment in it so there are parts of the
    We have a very hierarchical model. I still wonder to what extent EMF and
    QVTo at least try to let go of objects which are not needed anymore and
    allow them to be garbage collected?
    > Is the GC overhead limit not tied to the heap space limits of the JVM?
    Apparently not, quoting
    http://www.oracle.com/technetwork/java/javase/gc-tuning-6-140523.html:
    "The concurrent collector will throw an OutOfMemoryError if too much
    time is being spent in garbage collection: if more than 98% of the total
    time is spent in garbage collection and less than 2% of the heap is
    recovered, an OutOfMemoryError will be thrown. This feature is designed
    to prevent applications from running for an extended period of time
    while making little or no progress because the heap is too small. If
    necessary, this feature can be disabled by adding the option
    -XX:-UseGCOverheadLimit to the command line."
    I will experiment a little bit with different GC's, namely the parallel GC.
    Regards
    Marius

  • Working with Large data sets Waveforms

    When collecting data at a high rate (30K) and for a long period (120 seconds), I'm unable to rearrange the data due to memory errors. Is there a more efficient method?
    Attachments:
    Convert2Dto1D.vi ‏36 KB

    Some suggestions:
    Preallocate your final data before you start your calculations.  The build array you have in your loop will tend to fragment memory, giving you issues.
    Use the In Place Element to get data to/from your waveforms.  You can use it to get single waveforms from your 2D array and Y data from a waveform.
    Do not use the Transpose and autoindex.  It is adding a copy of data.
    Use the Array palette functions (e.g. Reshape Array) to change sizes of current data in place (if possible).
    You may want to read Managing Large Data Sets in LabVIEW.
    Your initial post is missing some information.  How many channels are you acquiring and what is the bit depth of each channel?  30kHz is a relatively slow acquisition rate for a single channel (NI sells instruments which acquire at 2GHz).  120s of data from said single channel is modestly large, but not huge.  If you have 100 channels, things change.  If you are acquiring them at 32-bit resolution, things change (although not as much).  Please post these parameters and we can help more.
    This account is no longer active. Contact ShadesOfGray for current posts and information.

  • Unable to Compile large Form

    Hi.
    I have a strange problem. I have a large form (1.98 MB and growing). When I try to "compile all" by connecting to my local database, it looks like the form disconnects and hence gives me compilation errors. But if I connect to a remote database (client), it compiles fine. The reason I bring up the size is that it used to compile fine when it was smaller.
    Any advice/suggestions will be greatly appreciated.
    Thanks.

    No error messages, just the regular messages (for example: "identifier 'Dual' must be declared... SQL statement ignored") that I get when I am not connected to the DB.
    The following is the list of versions I have.
    Forms [32 Bit] Version 6.0.8.19.2 (Production)
    Oracle8i Personal Edition Release 8.1.5.0.0 - Production
    With the Java option
    PL/SQL Release 8.1.5.0.0 - Production
    Oracle Toolkit Version 6.0.8.19.1 (Production)
    PL/SQL Version 8.0.6.3.0 (Production)
    Oracle Procedure Builder V6.0.8.17.0 Build #863 - Production
    PL/SQL Editor (c) WinMain Software (www.winmain.com), v1.0 (Production)
    Oracle Query Builder 6.0.7.1.0 - Production
    Oracle Virtual Graphics System Version 6.0.5.38.0 (Production)
    Oracle Tools GUI Utilities Version 6.0.8.11.0 (Production)
    Oracle Multimedia Version 6.0.8.18.1 (Production)
    Oracle Tools Integration Version 6.0.8.18.0 (Production)
    Oracle Tools Common Area Version 6.0.8.18.0
    Oracle CORE Version 4.0.6.0.0 - Production
    Thanks.

  • Web Services with Large Result Sets

    Hi,
    We have an application where a call to a web service could potentially yield a large result set. For the sake of argument, let's say that we cannot limit the result set size, i.e., by criteria narrowing or some other means.
    Have any of you handled paging when using Web Services? If so, can you please share your experiences, considering that Web Services are stateless? Any patterns that have worked? I am aware of the Value List pattern but am looking for previous experiences here.
    Thanks

    Joseph Weinstein wrote:
    > Aswin Dinakar wrote:
    > > I ran the test again and I removed the ResultSet.Fetch_Forward and it
    > > still gave me the same OutOfMemory error. The problem to me is similar
    > > to what Slava has described. I am parsing the result set in memory,
    > > storing the results in a hash map, and then emptying the post-processed
    > > results into a table. The hash map turns out to be very big and the JVM
    > > throws an OutOfMemory exception. I am not sure how I can turn this
    > > around. I can partition my query so that it returns smaller chunks or
    > > "blocks" of data each time (say a page of data or two pages of data),
    > > and then store a page of data in the table. The problem with this
    > > approach is that it is not exactly transactional; recovery would be
    > > very difficult. I could do this in a try/catch block, page by page, and
    > > then the catch could go ahead and delete the rows that got committed.
    > > The question then becomes: what if that transaction fails?
    > It sounds like you're committing the 'cardinal performance sin of DBMS
    > processing': shovelling lots of raw data out of the DBMS, processing it
    > in some small way, and sending it (or some of it) back. You should
    > instead do this processing in a stored procedure or procedures, so the
    > data is manipulated where it is. The DBMS was written from the ground up
    > to be a fast, efficient set-based processor. Using clever SQL will pay
    > off greatly. Build your saw-mills where the trees are.
    > Joe
    Yes, we did think of stored procedures. Like I mentioned yesterday, some of the post-processing depends on Unicode and specific character sets. Java seemed ideally suited to this since it handles these Unicode characters very well and has all these libraries we can use. Moving this to the DBMS would mean we would make it proprietary (not that we won't do it if it became absolutely essential), but it's one of the reasons why the post-processing happens in Java. Now that you mention it, stored procedures seem the best option.

  • Planning 11.1.1.1 - Large forms not rendering

    Hi,
    I'm at a client where large forms are not rendering and it's crashing Planning. I vaguely remember there are a few settings that need to be tweaked between WebSphere and Planning. I think they are the Xms and Xmx settings, or something like that. I remember they can be half of the available RAM, etc. Not really sure, and thus the question. What settings can we check to speed up form performance and possibly allow larger forms to render OK?
    Thanks

    It sounds as though you're referring to the java heap settings.
    If you're in a Windows environment, these are edited in the registry, using regedit. (HKEY_LOCAL_MACHINE\SOFTWARE\Hyperion Solutions\ . . . look for your Hyperion Planning web app)
    If you're in a Unix environment, these are likely set in the script that starts the services.
    The appropriate settings will depend a bit on whether you're using a 32bit or 64bit OS. Most 32bit OS's limit an application to 2GB and exceeding this can cause the app to crash. With that said, I generally see Planning set to about 1024MB. Tech support used to discourage settings much higher than that. I would set the min and max to the same setting - 1024.
    Hope this helps,
    - Jake

  • I-bot not emailing when report returns large result set..

    Hi,
    I am trying to set up an i-bot to run daily and email the results to the user. Assuming the report in question is Report_A.
    Report_A returns around 60,000 rows of data without any filter condition. When I tried to set up the i-bot for Report_A (no filter conditions on the report), the i-bot publishes results to the dashboard but does not deliver them via email. When I introduce a filter in Report_A to reduce the data returned, everything works fine and the email is sent out successfully.
    So
    1) Is there a size limit for i-bots to deliver by email?
    2) Is there a way to increase the limits if any so the report can be emailed even when returning large result sets?
    Please let me know.

    Sorry for the late reply.
    Below is the log file for one of the i-bots. Now I am getting an error message "***kmsgPortalGoRequestHasBeenCancelled: message text not found ***" and the i-bot alert message shows as "Cancelled".
    +++ ThreadID: f3c6cb90 : 2010-12-17 23:55:04.551
    [nQSError: 77006] Oracle BI Presentation Server Error: A fatal error occurred while processing the request. The server responded with: ***kmsgPortalGoRequestHasBeenCancelled: message text not found ***
    Error Codes: YLKKAV7S
    Error Codes: AGEGTYVF
    +++ ThreadID: f3c6cb90 : 2010-12-17 23:55:04.553
    iBotID: /shared/_ibots/common/TM/Claims Report
    ...Trying iBot Get Response Content loop again.
    +++ ThreadID: f3c6cb90 : 2010-12-17 23:55:04.554
    ... Sleeping for 8 seconds.
    +++ ThreadID: f3c6cb90 : 2010-12-17 23:55:12.642
    [nQSError: 77006] Oracle BI Presentation Server Error: A fatal error occurred while processing the request. The server responded with: ***kmsgPortalGoRequestHasBeenCancelled: message text not found ***
    Error Codes: YLKKAV7S
    Error Codes: AGEGTYVF
    +++ ThreadID: f3c6cb90 : 2010-12-17 23:55:12.644
    iBotID: /shared/_ibots/common/TM/Claims Report
    ...Trying iBot Get Response Content loop again.
    +++ ThreadID: f3c6cb90 : 2010-12-17 23:55:12.644
    ... Sleeping for 6 seconds.
    +++ ThreadID: f3c6cb90 : 2010-12-17 23:55:18.730
    [nQSError: 77006] Oracle BI Presentation Server Error: A fatal error occurred while processing the request. The server responded with: ***kmsgPortalGoRequestHasBeenCancelled: message text not found ***
    Error Codes: YLKKAV7S
    Error Codes: AGEGTYVF
    +++ ThreadID: f3c6cb90 : 2010-12-17 23:55:18.734
    iBotID: /shared/_ibots/common/TM/Claims Report
    Exceeded number of request retries.

  • Copying large file sets to external drives hangs copy process

    Hi all,
    Goal: to move large media file libraries for iTunes, iPhoto, and iMovie to external drives, which will then serve as media drives for a new 2013 iMac. I am attempting to consolidate many old drives accumulated over the years onto newer and larger drives.
    Hardware: moving from a Mac Pro 2010 to a variety of USB and other drives for use with a 2013 iMac.  The example below is from the boot drive of the Mac Pro. Today, the target drive was a 3 TB Seagate GoFlex USB 3 drive formatted as HFS+ Journaled. All drives are this format. I was using the Seagate drive on both the Mac Pro (USB 2) and the iMac (USB 3). I also use a NitroAV FireWire and USB hub to connect 3-4 USB and FW drives to the Mac Pro.
    OS: Mac OS X 10.9.1 on Mac Pro 2010
    Problem: Today--trying to copy large file sets such as iTunes, iPhoto libs, iMovie events from internal Mac drives to external drive(s) will hang the copy process (forever). This seems to mostly happen with very large batches of files: for example, an entire folder of iMovie events, the iTunes library; the iPhoto library. Symptom is that the process starts and then hangs at a variety of different points, never completing the copy. Requires a force quit of Finder and then a hard power reboot of the Mac. Recent examples today were (a) a hang at 3 Gb for a 72 Gb iTunes file; (b) hang at 13 Gb for same 72 Gb iTunes file; (c) hang at 61 Gb for a 290 Gb iPhoto file. In the past, I have had similar drive-copying issues from a variety of USB 2, USB 3 and FW drives (old and new) mostly on the Mac Pro 2010. The libraries and programs seem to run fine with no errors. Small folder copying is rarely an issue. Drives are not making weird noises. Drives were checked for permissions and repairs. Early trip to Genius Bar did not find any hardware issues on the internal drives.
    I seem to get these "dropoff" of hard drives unmounting themselves and other drive-copy hangs more often than I should. These drives seem to be ok much of the time but they do drop off here and there.
    Attempted solutions today: (1) Turned off all networking on the Mac -- Ethernet and WiFi. This appeared to work and allowed the 72 Gb iTunes file to fully copy without an issue. However, on the next several attempts to copy the iPhoto library, the hangs returned (at 16 and then 61 Gb) with no additional workarounds. (2) Restarting changes the amount copied per attempt, but it still hangs. (3) The last line of a crash report said "Thunderbolt", but the Mac Pro has no Thunderbolt or Mini DisplayPort. I did format the Seagate drive on the new iMac that has Thunderbolt. ???
    Related threads were slightly different. Any thoughts or solutions would be appreciated. Better copy software than Apple's Finder? I want the new Mac to be clean and thus did not do data migration. Should I do that only for the iPhoto library? I'm stumped.
    It seems like more and more people will need to move large media file sets to external drives as they load up more and more iPhone movies (my thing) and buy new Macs with smaller flash storage. Why can't the copy process just "skip" the parts of the thing it can't copy and continue the process? Put an X on the photos/movies that didn't make it?
    Thanks -- John

    I'm having a similar problem.  I'm using a MacBook Pro 2012 with a 500GB SSD as the main drive, 1TB internal drive (removed the optical drive), and also tried running from a Sandisk Ultra 64GB Micro SDXC card with the beta version of Mavericks.
    I have a HUGE 1TB Final Cut Pro library that I need to get off my LaCie Thunderbolt drive and moved to a 3TB WD USB 3.0 drive.  Every time I've tried to copy it, the process would hang at some point, roughly 20% of the way through, then my MacBook would eventually restart on its own.  No luck getting the file copied.  Now I'm trying to create a disk image using Disk Utility to get the file from the Thunderbolt drive and saved to the 3TB WD drive. It's been running for half an hour so far and it appears that it could take as long as 5 hours to complete.
    Doing the copy via disk image was a shot in the dark and I'm not sure how well it will work if I need to actually use the files again. I'll post my results after I see what's happened.

  • How to make a form for input in web interface builder

    Hi expert:
        How to make a form for input in Web Interface Builder? I have already used it to do PS planning, but I don't know how to draw lines and checkboxes. Thanks in advance.
    Allen

    WAD:
    Open the WAD and create a new template. On the left hand navigation you will have several Web Items available. Under 'Standard' you have 'Analysis' item. Pull that into your template to the right. Under the Properties tab you need to pick the query [form/layout] that you have built in Query Designer.
    You will also find other items such as Button group, Checkbox, dropdown, list box, etc. available. Pick and drag into the template whatever it is you require. Let's say you want a button. Under the Properties tab select the 'Command' that you require. You could use standard commands that are available there. You could also define functions and commands that you require.
    Query Designer:
    Open the QD and drag the characteristics and key figures that you require into the rows and columns of the QD. You would need to specify restrictions under the Filter tab of the QD based on the granularity of data that you require. You would need to remember that the key figures need to be made Input Ready [do this by clicking on KF and on the planning tab select "change by user and planning functions"].
    This should give you a start. After you've explored it yourself a bit we can discuss further, and I can certainly provide you additional details/material on these areas.
    Srikant

  • Multiple Signatures on Large Plan Sets? - Open to Other Security Measures

    Good afternoon,
    I'm looking for a way to circulate large plan sets to three reviewers for digital signature. The PDFs are, and must remain, formatted to clearly print at 24''x36''. As some plan sets are 12+ pages, this is too large to pass along through our email.
    We do our plan review through an online portal, where each department marks up the PDF, exports comments as FDF, and then the case manager imports all comments into one PDF. I'd prefer to find a solution for signing plans using this portal too, but I imagine it will have to be done offline and then uploaded.
    I'm also open to any other solutions for locking and securing an approved document. If not a signature, maybe a timestamp or watermark on each page? Anything more secure than a simple stamp that can so easily be altered.
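
    On the watermark/timestamp idea: Acrobat's JavaScript API can stamp text across every page of the approved set, which is at least harder to alter than an ordinary stamp comment (though not tamper-proof the way a certified digital signature is). A minimal sketch with placeholder wording, run from the console or a batch Action:

        // Acrobat JavaScript: add an approval watermark to all pages of the open document
        this.addWatermarkFromText({
            cText: "APPROVED " + util.printd("yyyy/mm/dd", new Date()),  // placeholder text
            nTextAlign: app.constants.align.center,
            nHorizAlign: app.constants.align.center,
            nVertAlign: app.constants.align.bottom,
            nOpacity: 0.5
        });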

  • Interactive Forms within an Acrobat Portfolio--- why not??

    Hi,
    I have a student asking why she isn't able to save a group of interactive forms within an Acrobat Portfolio, and have the Tracker recognize the responses to the form. Sounds as if this shouldn't be a problem. But...
    I've tried it myself, and it appears that when working from within a Portfolio, I have no access to the "Distribute Form" command. Is this correct or is it a bug?
    If I skip the Distribute Form option and instead upload the Portfolio to Acrobat.com, I *am* able to fill out the form and hit "submit". It *does* submit a reply to my email address, in the typical .fdf format. When I go to my email address I can download the .fdf file to my local machine, but after that the system seems to break down.
    The Tracker has no clue that any response has occurred. And when I try to "compile returned forms", the .fdf files are not recognized: I cannot import the response.
    Is it because there was no "Distribute Form", that the Tracker doesn't recognize responses?
    It would seem like the most useful thing to be able to send a variety of forms to new employees in the Portfolio format. Can it be done? And how?
    Thanks! I hope someone can help, or at least clarify. The student will be back tomorrow and I'd love to have a reply for her.

    Hi "critter_monster" It is by design that the distribute options is not available for files that are in a Portfolio. You can only distribute forms that are not attached to a portfolio. Since tracker only tracks forms distributed the the distribution workflow, it is expected that it would not recognize the .fdf from the submit button.
    Adobe does have a "wish list" form that you can submit feature requests through. You can always submit a request for that functionality. The form is here: https://www.adobe.com/cfusion/mmform/index.cfm?name=wishform&promoid=EWQQL
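
    As an aside, for the manual route described above (a submit button returning .fdf by email, imported by hand rather than through Tracker), the button's action can be set up along these lines. A minimal sketch, with a placeholder address, assuming an AcroForm-style form:

        // Acrobat JavaScript, Mouse Up action of a submit button (address is a placeholder)
        this.submitForm({
            cURL: "mailto:forms.inbox@example.com?subject=New employee forms",
            cSubmitAs: "FDF"      // return just the form data, not the whole PDF
        });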
