Problems with Merge Data Files

I created a form with radio buttons and finally have it working on my computer, including Merge Data Files creating the report.csv file. However, when my co-worker creates the report.csv, the columns for the radio buttons are not included. I had this problem on my own computer when I had names assigned to the radio buttons, but fixed it. The data columns for the radio buttons are simply not exported on the co-worker's computer. We are both on Acrobat Pro X; mine is version 10.0.0 but my co-worker's is 10.1.7.
Any suggestions?
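Before chasing the version difference, it may help to compare the headers of the two exported files directly and see exactly which form-field columns differ between the two machines. Here is a minimal Python sketch of that check; the file names are hypothetical:

import csv

# Read just the header row of each exported report.
def headers(path):
    with open(path, newline="") as f:
        return next(csv.reader(f))

mine = set(headers("report_mine.csv"))        # export from my machine
theirs = set(headers("report_coworker.csv"))  # export from the co-worker's machine

print("Missing from co-worker's export:", sorted(mine - theirs))
print("Extra in co-worker's export:", sorted(theirs - mine))

If the radio-button group names show up in one list but not the other, you at least know the fields exist in the form and the difference is in the export step, not the form itself.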

Back up all data.
Please triple-click anywhere in the line below on this page to select it:
defaults delete -app Safari WebKitOmitPDFSupport
Copy the selected text to the Clipboard by pressing the key combination command-C.
Quit Safari. Launch the built-in Terminal application in any of the following ways:
☞ Enter the first few letters of its name into a Spotlight search. Select it in the results (it should be at the top.)
☞ In the Finder, select Go ▹ Utilities from the menu bar, or press the key combination shift-command-U. The application is in the folder that opens.
☞ Open LaunchPad. Click Utilities, then Terminal in the icon grid.
Paste into the Terminal window by pressing the key combination command-V. (I've tested these instructions only with the Safari web browser; if you use another browser, you may have to press the return key after pasting.)
Wait for a new line ending in a dollar sign (“$”) to appear below what you entered. You can then quit Terminal. Test.

Similar Messages

  • Listbox problem with merge data option

Listboxes don't show up like all the other objects do when I use "More form options" and "Merge data files into spreadsheet." Is this a known issue? Is there a solution? I'm using LiveCycle ES4.

    It got fixed by removing a filter from a dimension, and placing that dimension as a page.
    Thanks,
    Hector

  • Problem with preview PDF files on Windows 7 32bit

    Hello,
in my office I have a problem previewing PDF files on some of my computers. The files themselves are all right, but if I select a file and click on preview mode, after about 10 seconds the operating system reports something like "Preview of this file is corrupt - PDF Preview Handler error". On other 64-bit computers with the same programs the preview works. I tried uninstalling all Adobe software (Reader, Flash) and cleaning all Adobe data from the folders on the root drive and from the registry, but that did not solve the problem. On the Microsoft forums I was told that the problem is probably with the Adobe Reader plugin.
    So, can someone help me?
    Config PC:
    Windows 7 32/64bit Pro
    Intel i3, 4GB
    Adobe Reader 11.0.5
    Adobe Flash Player 15 ActiveX
    Adobe Flash Player 15 plugin
    Preview Handler Surrogate Host ver.: 6.1.7601.17514
    Thank you for help.

    I think I figured it out after reading other forums. Thanks!
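    For anyone hitting the same wall: one low-risk diagnostic is to read which preview handler is actually registered for .pdf. A hedged Python sketch follows; the GUID below is the shell's standard preview-handler category, but on some setups the key lives under the ProgID (e.g. AcroExch.Document) rather than under .pdf:

    import winreg

    # The shell stores the preview handler for a file type under
    # <type>\ShellEx\{8895b1c6-b41f-4c1c-a562-0d564250836f}; the default
    # value of that key is the CLSID of the registered handler.
    KEY = r".pdf\ShellEx\{8895b1c6-b41f-4c1c-a562-0d564250836f}"

    with winreg.OpenKey(winreg.HKEY_CLASSES_ROOT, KEY) as key:
        clsid, _ = winreg.QueryValueEx(key, "")
    print("Preview handler CLSID for .pdf:", clsid)

    # Resolve the CLSID to a display name to confirm it is Adobe's handler.
    with winreg.OpenKey(winreg.HKEY_CLASSES_ROOT, "CLSID\\" + clsid) as key:
        name, _ = winreg.QueryValueEx(key, "")
    print("Handler name:", name)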

  • InDesign CS5 causing problems with InCopy assignment file but not content files

I'm having a strange InDesign/InCopy CS5 issue (Mac OS X 10.6.6; all system & Adobe updates are current). In my InDesign doc I have one assignment file containing two content files. All of the InCopy pieces work fine in InDesign (i.e., I can check out/in, edit, etc.). When I go to open the assignment file in InCopy, InCopy starts to open it but then crashes. If I try to open each content file directly in InCopy, no problem. I've tried deleting and recreating new assignments in InDesign, but they all have the same result in InCopy.
    I think this actually is a problem with the InDesign file because if I export the InDesign file to IDML, then try to open the IDML file, InDesign crashes! I've trashed my InDesign preferences, made sure all cross-references are up to date, etc. Anybody have any ideas what else to look for in my InDesign file that might cause this problem?
    Thanks!
    Andrea

    Thanks John!
    1) Here are the first 10-ish lines from the crash report that is generated when I try to open the IDML file:
    Thread 0 Crashed:  Dispatch queue: com.apple.main-thread
    0   ???                               0xa0c266f0 _XHNDL_trapback_instruction + 0
    1   com.adobe.InDesign.Indexing       0x20778192 GetPlugIn + 341394
    2   com.adobe.InDesign.Indexing       0x20779009 GetPlugIn + 345097
    3   PublicLib.dylib                   0x0129228b CScriptProvider::AccessProperties(IScriptRequestData*, IScript*) + 571
    4   com.adobe.InDesign.Scripting      0x1f4befe8 GetPlugIn + 166440
    5   com.adobe.InDesign.Scripting      0x1f4c2e31 GetPlugIn + 182385
    6   com.adobe.InDesign.INXCore        0x20641444 GetPlugIn + 45220
    7   PublicLib.dylib                   0x0142c9e8 CScriptDOMElement::GetMultipleAttributes(K2Vector<IDType<ScriptID_tag>, K2Allocator<IDType<ScriptID_tag> > > const&, adobe::version_1::vector<KeyValuePair<IDType<ScriptID_tag>, DOMAttributeValue>, adobe::version_1::capture_allocator<KeyValuePair<IDType<ScriptID_tag>, DOMAttributeValue> > >&) + 280
8   PublicLib.dylib                   0x0142bd38 CScriptDOMElement::InsertProperties(adobe::version_1::vector<KeyValuePair<IDType<ScriptID_tag>, ScriptData>, adobe::version_1::capture_allocator<KeyValuePair<IDType<ScriptID_tag>, ScriptData> > >&, adobe::version_1::vector<KeyValuePair<IDType<ScriptID_tag>, DOMAttributeValue>, adobe::version_1::capture_allocator<KeyValuePair<IDType<ScriptID_tag>, DOMAttributeValue> > > const&, short, short) + 1336
9   PublicLib.dylib                   0x0142beab CScriptDOMElement::SetSimpleAttributes(adobe::version_1::vector<KeyValuePair<IDType<ScriptID_tag>, DOMAttributeValue>, adobe::version_1::capture_allocator<KeyValuePair<IDType<ScriptID_tag>, DOMAttributeValue> > > const&, short) + 91
10  PublicLib.dylib                   0x0142c0d3 CScriptDOMElement::SetAttributes(adobe::version_1::vector<KeyValuePair<IDType<ScriptID_tag>, DOMAttributeValue>, adobe::version_1::capture_allocator<KeyValuePair<IDType<ScriptID_tag>, DOMAttributeValue> > > const&) + 51
    11  PublicLib.dylib                   0x0142a2ea CScriptDOMElement::SetAttribute(IDType<ScriptID_tag>, DOMAttributeValue const&) + 202
2) That works. Moved all pages (frame threading was preserved) to a new doc, exported to IDML, opened the new IDML file just fine. Woo hoo! One of my editors won't be pleased that all of the tracked changes have disappeared, but at least the file functions now.
    Wish I knew what caused the problem in the first place so we could (hopefully) prevent this from happening again. Any ideas?
    Thanks again for your help!
    Cheers,
    Andrea

  • Problem with loading data to Essbase

    Hi All,
I have a problem with loading data into Essbase. I've prepared a MaxL script to load the data, calling a rule file. The source table is located in Oracle (RDBMS). The script works correctly, i.e., it generally loads data into Essbase.
The problem is that after deleting data from Essbase, when I try to load it again from the source table I get the message: WARNING - 1003035 - No data values modified by load of this data file - although there is no data in Essbase... I've also tried changing the load mode from 'overwrite' to 'add to existing values' (in the rule file) but it doesn't help... Any ideas what I can do?

    Below few lines from EPM_ORACLE_INSTANCE/diagnostics/logs/essbase/dataload_ODL.err:
    [2013-09-24T12:01:40.480-10:01] [ESSBASE0] [AGENT-1160] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1116830016] Received Validate Login Session request
    [2013-09-24T12:01:40.482-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1114724672] Received client request: Get App and Database Status (from user [admin@Native Directory])
    [2013-09-24T12:01:54.488-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1101564224] Received client request: MaxL: Execute (from user [admin@Native Directory])
    [2013-09-24T12:01:54.492-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1115777344] Received client request: MaxL: Describe (from user [admin@Native Directory])
    [2013-09-24T12:01:54.492-10:01] [ESSBASE0] [MLEXEC-2] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1115777344] Output columns described
    [2013-09-24T12:01:54.494-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1102616896] Received client request: MaxL: Define (from user [admin@Native Directory])
    [2013-09-24T12:01:54.494-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1102616896] Received client request: MaxL: Fetch (from user [admin@Native Directory])
    [2013-09-24T12:01:54.494-10:01] [ESSBASE0] [MLEXEC-3] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1102616896] Record(s) fetched
    [2013-09-24T12:01:54.496-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1116830016] Received client request: MaxL: Fetch (from user [admin@Native Directory])
    [2013-09-24T12:01:54.498-10:01] [ESSBASE0] [AGENT-1160] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1114724672] Received Validate Login Session request
    [2013-09-24T12:01:54.499-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1101564224] Received client request: Get Application State (from user [admin@Native Directory])

  • Performance problem with MERGE statement

    Version : 11.1.0.7.0
I have an INSERT statement like the following, which takes less than 2 seconds to complete and inserts around 4000 rows:
INSERT INTO sch.tab1
          (c1,c2,c3)
SELECT c1,c2,c3
   FROM sch1.tab1@dblink
  WHERE c1 IN (SELECT c1 FROM sch1.tab2@dblink);
I wanted to change it to a MERGE statement just to avoid duplicate data. I changed it to the following:
MERGE INTO sch.tab1 t1
USING (SELECT c1,c2,c3
   FROM sch1.tab1@dblink
  WHERE c1 IN (SELECT c1 FROM sch1.tab2@dblink)) t2
ON (t1.c1 = t2.c1)
WHEN NOT MATCHED THEN
INSERT (t1.c1,t1.c2,t1.c3)
VALUES (t2.c1,t2.c2,t2.c3);
The MERGE statement was taking more than 2 minutes (I stopped the execution after that). When I removed the WHERE clause subquery inside the subquery of the USING section, it executed in 1 second.
If I execute the same SELECT statement with the WHERE clause outside of the MERGE statement, it takes just 1 second to return the data.
Is there any known issue with the MERGE statement in a scenario like the above?

riedelme wrote:
Are your join columns indexed?
Yes, the join columns are indexed.
riedelme wrote:
You are doing a remote query inside the merge; remote queries can slow things down. Do you have to select all the rows from the remote table? What if you copied them locally using a materialized view?
Yes, I agree that remote queries slow things down, but the same thing does not happen with SELECT, INSERT, or PL/SQL; it happens only when we use MERGE. I still have to test what happens if the subquery refers to a local table or materialized view. Even if that works, I think there is still a problem with MERGE in the case of remote subqueries (at least until I test local queries). I wish someone could test similar scenarios so we can know whether it is a genuine problem or something specific on my side.
riedelme wrote:
BTW, I haven't had great luck with MERGE either :(. Last time I tried to use it I found it faster to use a loop with insert/update logic.
Edited by: riedelme on Jul 28, 2009 12:12 PM
:) I used the same approach to get past this; a sketch of that insert-if-missing idea follows below. I think MERGE still needs functional improvement on Oracle's side; I personally feel it could be one of the most robust features in SQL or PL/SQL.
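    Not the PL/SQL loop riedelme described, but the same insert-if-missing idea can be sketched from a client script. A hedged Python example using the cx_Oracle driver; the connection string is hypothetical, and the table and dblink names are taken from the thread:

    import cx_Oracle

    # Connection details are hypothetical.
    conn = cx_Oracle.connect("user/password@localhost/orcl")
    cur = conn.cursor()

    # Pull the remote rows once, outside the MERGE, so the remote
    # subquery cost is paid a single time.
    cur.execute("""
        SELECT c1, c2, c3
          FROM sch1.tab1@dblink
         WHERE c1 IN (SELECT c1 FROM sch1.tab2@dblink)
    """)
    rows = cur.fetchall()

    # Insert only the rows not already present locally, mirroring
    # MERGE's WHEN NOT MATCHED branch.
    for c1, c2, c3 in rows:
        cur.execute("SELECT COUNT(*) FROM sch.tab1 WHERE c1 = :1", [c1])
        if cur.fetchone()[0] == 0:
            cur.execute(
                "INSERT INTO sch.tab1 (c1, c2, c3) VALUES (:1, :2, :3)",
                [c1, c2, c3],
            )

    conn.commit()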

  • PSP: problems with viewing data

    Hello.
I'm currently working on an online shop and have some problems with viewing data from the database. When there are not many INSERTs into the table it works very well, but after running all of the INSERTs I have, it acts strangely.
    Sample with 10 INSERTS:
    http://gafgarion.atspace.com/psp/1.jpg
    Sample with 100 INSERTS:
    http://gafgarion.atspace.com/psp/2.jpg
I'm using Oracle 9i. When I have more data in my database it acts strangely. The SELECT is from only one table, but sometimes I get data from other tables as well.
I didn't touch any config files or anything else; I only created a new User and DAD.
Any ideas what I should do to fix this?
Thanks in advance

    Hello,
My guess is that you are speaking about PL/SQL Server Pages (PSP) and the PL/SQL Web Toolkit.
That is why I do not think you will get many answers here, since this forum is targeted toward Web Services developers (XML, SOAP, and so on).
I invite you to ask your question on the Oracle Application Server - General or PL/SQL forums.
    Regards
    Tugdual Grall

  • Problems with Merging Account

    Hi Experts,
    I have 2 problems with merging accounts:
1) The opportunity is not moved from the source data to the master data, and I do not know why. The central BP is changeable.
2) The contact person is only copied, not moved. This causes problems later on when we want to delete the source data.
    Any idea how to solve this?
    Best regards,
    Cristina

    Hi Arno:
    First of all: Thanks for answering my thread!
The problem with opportunities in my system is that they are not merged at all, not even with the batch job. I checked trx. BUSWU02; node CRM370 is assigned to variant CLEAR_REP.
Also, the central Business Partner in the opportunity remains changeable once the opportunity has been saved.
Is there anything else I have to check?
    Best regards,
    Cristina

  • Problem with a data set: DIAdem crashes

    Hi,
I've got a problem with a data set. When I try to zoom in DIAdem-View, DIAdem crashes with the following message (translated from German ;-):
    error type: FLOAT INEXACT RESULT or FLOAT INVALID OPERATION or FLOAT STACK CHECK
    error address: 00016CB8
    module name: gfsview.DLL
I've got some similar data sets that don't show this problem. I also scanned the data a bit, but in the 59,000 points I didn't see anything special. I tried to delete the "NOVALUE"s as well, but after that "NOVALUE"s still exist.
    Does anyone have an idea what to look for?
    Thanks,
    Carsten

    Carsten,
Could you please upload your Citadel database to the following FTP site:
    ftp.ni.com/incoming
    If you want to compress (ZIP) and/or put a password on the data, that's fine. Please send me a private email at [email protected] (with the file name and password if you put one on the file) once you have uploaded the file and I will check it out.
    Otmar
    Otmar D. Foehner
    Business Development Manager
    DIAdem and Test Data Management
    National Instruments
    Austin, TX - USA
    "For an optimist the glass is half full, for a pessimist it's half empty, and for an engineer is twice bigger than necessary."

  • Problem with input data format - not "only" XML

    Hi Experts,
I have a problem with the input data format.
I get some data from JMS (MQSeries), but the input format is not clean XML.
It is something like a flat file with XML content inside…
    Example:
    0000084202008-11-0511:37<?xml version="1.0" encoding="UTF-8"?>
<Document xmlns="urn:xsd:test.01" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" Sndr="0001" Rcvr="SP">
    ...content....
    </Document>000016750
The problems in this file are:
1. data before the XML declaration <?xml version="1.0"?> -> 0000084202008-11-0511:37
2. data after the last tag </Document> -> 000016750
This data breaks the XML format.
Unfortunately XI is not the only receiver of these files, and we can't change the file format in the MQSeries queue (before it goes to XI).
    My goal is to get XML from this file:
    <?xml version="1.0" encoding="UTF-8"?>
<Document xmlns="urn:xsd:test.01" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" Sndr="0001" Rcvr="SP">
    ...content....
    </Document>
    My questions:
1. Is there any way or technique in XI to delete the data 0000084202008-11-0511:37 from this file?
2. Is there any way to get only the XML from this file?
Thanks.
    Regards,
    Seo

    Hi Buddy
Which XI adapter are you using?
If you use inbound File adapter content conversion, you could replace these values with none and then pass the result to the scenario; a rough sketch of the same trimming idea follows below.
    Does that help?
    Regards
    cK
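    To make cK's suggestion concrete: whatever layer does the trimming (content conversion, an adapter module, or a pre-processing step), the logic is just "keep everything from the first <?xml to the last closing tag". A minimal Python sketch of that idea, using the tag names from the example above:

    # Keep everything between the first "<?xml" and the last "</Document>".
    def extract_xml(raw):
        start = raw.index("<?xml")
        end = raw.rindex("</Document>") + len("</Document>")
        return raw[start:end]

    raw = ('0000084202008-11-0511:37<?xml version="1.0" encoding="UTF-8"?>'
           '<Document xmlns="urn:xsd:test.01">...content...</Document>000016750')
    print(extract_xml(raw))  # prints only the XML part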

  • Problem with universal data cleanse

I have a problem with universal data cleanse.
    I am using DS 3.2.x (12.2.2).
    I created:
    Dictionary: TEST
    Classification: TEST_CL
    Custom outputs: category: TEST_CAT
    Dictionary Entry:
    Primary: BO
    Classification: TEST_CL
    Gender: Unassigned
    Secondary information:
    When used as: TEST_CAT
    Standard: Business Objects
    Rule file:
    DataCleanse Rule File v2.0;
    TEST_CAT = TEST_CL;
    action = TEST_CAT;
    TEST_CAT = 1 : TEST_CAT : 1;
    end_action
    Data Cleanse:
    Input: Multiline1
    Options:
    Parsing Dictionary: TEST
    Rule file: .../test_rules.dat
    Break On Whitespace Only: Yes
    Parser Sequences Multiline1: TEST_CAT
    Output:
    Parent_component: TEST_CAT1
    Generated_field_name: TEST_CAT and RULE_LABEL
    Generated_field_class: parsed/standardized
Content_type: none
    and EXTRA field
I wanted to replace the word "BO" with the standard words "Business Objects", but this is the result: "BO" appears in the extra field,
and there are nulls in the other fields.
What am I doing wrong?
    Thanks for all help!
    P.S. I don't have cleansing packages installer.

    There seem to be a couple of things going on here:
    1. If you are using your custom dictionaries, then you have to map your input to MULTILINE1 and enable the custom parsers - just something to be aware of
    2. You mentioned that you made some changes to the existing dictionary and you are not seeing any changes. To be clear, do you have different TEST and PRODUCTION environments? Or is it the same environment except that you have a local DS repository and for the dictionary you are pointing to another repository (using Dictionary --> Manage Connections)?
    Having the dictionaries on a different repository should not make any difference as long as you point to them in your designer using the Dictionary --> Manage Connection option.
    So I think there may be some issue with your job setup and/or dictionary values need to be looked at. You can start by adding another output field named "EXTRA" to see whether or not your data is getting parsed at all. Also, make sure the entry "CLEANME" is classified as FIRM_NAME_ALONE in the dictionary and that you are selecting the correct dictionary name in the Datacleanse Transform options.

  • Problem with the cellular data network option not appearing in iPhone Settings

    I have a problem with the cellular data network option: it is not appearing in iPhone Settings. Please help me bring this option back on my iPhone.

    What brand/model USB drive? Is it bus or AC powered?
    On Mail...
First quit Mail, then I'd back up these two Mail folders by right-clicking on them in the Finder and choosing Archive/Compress.
    Users/YourUserName/Library/Mail
    Users/YourUserName/Library/Mail Downloads
    (Could be a different folder here if you chose such in Mail Prefs)
Right-click on the Mail folder and choose Archive; you'll get everything in the folder, and the folder itself, in a file called Mail.zip. Move it to a safe place, and do the same for the Mail Downloads folder... only the plist is separate.
    /Users/YourUserName/Library/Preferences/com.apple.mail.plist

  • EMERGENCY! Problem with an .m4a file

I was recording an episode of my podcast (Radio Allegro- radio-allegro.com) and for some reason my software quit on me. I was using Audio Hijack and recording multiple people in Skype. I have never had a problem before, but when I reopened Audio Hijack Pro it was as if I had never recorded anything. When I found the file, the .m4a file was obviously corrupted. My question is: is there a way to fix this, and if so, with what? If you can answer this question I will publish your name on my site and give you the DVD of your choice. PLEASE HELP!!

A few questions (although most likely you're probably screwed):
What's the file size like? Does it look like the data could be intact? For example, if you think you've recorded an hour of audio and your file size is only around 150K, most likely your data is gone.
If the file size looks right for what you think should be in there, then it could be just a header screw-up in the data file. Theoretically, you could get into the file, figure out the headers, adjust them properly, and go from there; see the sketch after this reply.
    I've managed to pull that kind of stuff off a few times in the past with corrupted data files - but most often - your stuff is just gone.
    Good luck man!
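    A rough sketch of that first check in Python; the file name is hypothetical, and a healthy MPEG-4 container normally carries an "ftyp" box at byte offset 4:

    import os

    path = "episode.m4a"  # hypothetical file name

    size = os.path.getsize(path)
    with open(path, "rb") as f:
        header = f.read(12)

    # If an hour-long recording is only ~150K, the audio data is gone.
    print("File size:", size, "bytes")
    print("First 12 bytes:", header)
    # Bytes 4-8 of a well-formed MPEG-4 file usually spell "ftyp".
    print("Looks like an MPEG-4 container:", header[4:8] == b"ftyp")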

  • Is anyone else having problems with Apertures Date fields?

After experiencing a systems failure I began restoring my images from backups. I have had no end of problems with image dates. I have come to the conclusion that Aperture is NOT using either the "Create Date" or "DateTimeOriginal" field as the image "Date" or "Date Created" data if other fields contain other dates.
    Here is a clip from my image metadata using exiftool:
    File Modification Date/Time     : 2012:09:23 19:53:15-04:00
    File Access Date/Time           : 2013:01:16 11:34:38-05:00
    File Inode Change Date/Time     : 2013:01:16 11:31:32-05:00
    Create Date                          : 2006:02:20 11:51:12.10
    Date/Time Original              : 2006:02:20 16:51:12.10
    Modify Date                          : 2006:02:20 11:51:12.10
    And here is what Aperture utilized on import
    Date:                               9/23/12 7:53:15 PM EDT
    Date Created                         9/23/2012 7:53:15 PM
So Aperture utilized the File Modification Date/Time as the Create Date despite the fact that the Create Date field is present and properly formatted in the original image.
According to the Aperture mapping table, this shouldn't be happening.
Now, before someone recommends that I use Aperture's Date Adjust utility: I'm talking about slightly over 30,000 images. Editing images one at a time, or in blocks, when you don't know which field Aperture is using as the Create Date would require individual inspection of each image followed by manual adjustment of each image. That approach isn't acceptable.

Hmm, ok, let me restate the issue then; I thought I was clear. I could have pointed out in my original post, however, that only the EXIF and IPTC date fields were displayed from the metadata dump.
    The data set presented in my first post is the EXIF dataset from the file which clearly shows the image was captured by digital camera at
    Create Date                          : 2006:02:20 11:51:12.10
    Date/Time Original              : 2006:02:20 16:51:12.10
    (The delta of 5 hours is the result of Zulu versus local time offset)
But when I ingested it into Aperture, the import routine utilized the file-system field
File Modification Date/Time     : 2012:09:23 19:53:15-04:00
which is updated by the OS whenever you move the file around outside of Aperture (a number of file fields are updated by the OS; the filename, for example, is another). This resulted in the image date fields being stamped in Aperture as:
    Date:                               9/23/12 7:53:15 PM EDT
    Date Created                         9/23/2012 7:53:15 PM
Which obviously came from the File Modification Date/Time field and not from either the Create Date or Date/Time Original fields.
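    For a batch of 30,000 images, one way out (a hedged sketch, assuming exiftool is installed, and best run against a copy of the library first) is to reset each file's modification date from its embedded Create Date before re-importing, so Aperture has nothing newer to grab. The folder path below is hypothetical:

    import subprocess

    folder = "/Users/me/Pictures/restored"  # hypothetical path to the restored images

    # exiftool's tag-copy syntax "-DST<SRC" rewrites FileModifyDate from the
    # embedded CreateDate; -r recurses into subfolders. Try one file first.
    subprocess.run(
        ["exiftool", "-r", "-FileModifyDate<CreateDate", folder],
        check=True,
    )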

  • Merge Data Files into Spreadsheet not working

HELP! I created the form files in Adobe Acrobat Pro XI, saved them as Reader Extended PDFs, and added a submit button that emails the responses to me; I download them to a folder on my desktop. When I try to "Merge Data Files into Spreadsheet", all I get back is the file name. I've tried with 10 files and with only one file and it STILL won't work, so it is not that the fields don't line up.
    What do I do now?

    It's hard to say without looking at one of the files. If you can't post one somewhere, I'd be happy to take a look if you're free to email me: acroscript at gmail dot com
