Importing previously exported data (from local file)

After searching to no end, I still cannot find the answer to my question...
Our current environment is SAP with an FLM (Forms Lifecycle Management) system, using LiveCycle Designer for form design, and all users are on at least Reader 9.
I would like to know whether it is possible for a user to fill in a form and save the entered data to a local file, so that the following week they can open the latest version of the PDF (requested from the FLM portal, to be completed either online via the portal or offline via email), import the saved data, make any minor updates, and submit. The forms should have Reader extensions applied, though I am as yet unsure of the exact settings (if any), as this happens automatically within FLM. The form shouldn't change in future beyond some additional validation; this approach would allow form numbers and versions within FLM to be better maintained, and would ensure users do not keep or use outdated forms.
From my understanding there are two ways to import data: either folder-level JavaScript or certifying the form. For our setup, folder-level scripts are a non-starter, as we would have no way to maintain them on every PC (there are far too many), so that leaves us with certifying forms. From the limited information I have found, it appears that the certification may break when importing data; I have tested this very basically and it appears to be true.
So my last thought was: if I import data programmatically via JavaScript and fill in the fields that way, would this still have the same effect of breaking the certification?
Appreciate any advice
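For the save-and-reload mechanics themselves, the kind of thing I have in mind is sketched below (paths are examples only, and whether the export/import calls are allowed in Reader depends on the usage rights FLM applies):
// Rough sketch (Acrobat JavaScript): save the entered data locally,
// then pull it back into a newer version of the form.
// Week 1: export all field values to a local FDF file
this.exportAsFDF({ bAllFields: true, cPath: "/c/temp/formdata.fdf" });
// Week 2, in the latest form version: import the saved data
this.importAnFDF("/c/temp/formdata.fdf");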

> Hi Srdjan.
>
> Are you familiar with the MDM Import Manager?
>
> Best regards,
> Nir
Hi
I haven't tried the MDM Import Manager, but I solved it with LSMW. I created a recording for one record entry, then specified the rest of the data in a text file and passed it to LSMW. It worked (almost) perfectly!
Thanks for the suggestion though, I'll give it a try some other time
Best regards,
S.

Similar Messages

  • Import data from local file: insufficient privilege

    As part of a project I'm working on, I have to load data (only a few thousand records) into a HANA development instance on HCP. Last week I did this several times with no problem using File > Import in HANA Studio and choosing Data From Local File. Yesterday and today I get this message (most of the time, but not all of the time): SAP DBTech JDBC: [258]: insufficient privilege: Not authorized.
    Does anybody know why this might be and what to do about it?
    Are there batch alternatives? I have not found any. The INPUT INTO statement also gives authorization problems.
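    (For reference, the kind of server-side batch statement I was hoping to use is sketched below. It assumes the file sits on the HANA server itself and that the user holds the IMPORT privilege, which may be exactly what my instance is missing; paths are illustrative.)
    -- Sketch of a server-side bulk load (privileges and paths assumed):
    IMPORT FROM CSV FILE '/usr/sap/HDB/work/data.csv'
    INTO "MYSCHEMA"."MYTABLE"
    WITH RECORD DELIMITED BY '\n'
         FIELD DELIMITED BY ','
         ERROR LOG '/usr/sap/HDB/work/import_errors.log';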

    Thanks for your replies.
    Jochen - I can't find the schema.ini file, and your link doesn't work - that site is under construction.
    Mahesh - I tried to change the cells' format in Excel, but the file is saved without the changes (as CSV).
    Pandey - I read about DTW but I can't find it anywhere. Is there a site it can be downloaded from?
    Finally I solved the problem: after saving the file as XLS I could change the cells' format and save the changes. It helped - the data '1-10-103' is now imported correctly.
    Regards,
    Hmg

  • Does the tree have a standard function to export data to a local file?

    Hi,
    does the tree control have a standard function to export data to a local file? You know, ALV has a standard button to do it.
    Thanks,
    qimingxing.

    I don't think there is a method built into the tree object for doing this.
    [SAP Tree and Tree Model|http://help.sap.com/saphelp_47x200/helpdata/EN/b7/147a36c70d2354e10000009b38f839/frameset.htm]
    I think one way you could do it is have an option on the application menu, then manually parse the data in the tree and build up an itab yourself from the expanded nodes (or all nodes or whatever). Then you could export the data from the itab.
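    For example, once the itab is filled from the nodes, the download itself is the easy part - a rough sketch (the table type, path, and message are illustrative):
    * Rough sketch: write the collected node lines to a local file.
    DATA lt_nodes TYPE TABLE OF string.
    " ... fill lt_nodes by walking the expanded tree nodes ...
    CALL FUNCTION 'GUI_DOWNLOAD'
      EXPORTING
        filename = 'C:\temp\tree_export.txt'
        filetype = 'ASC'
      TABLES
        data_tab = lt_nodes
      EXCEPTIONS
        OTHERS   = 1.
    IF sy-subrc <> 0.
      MESSAGE 'Download failed' TYPE 'E'.
    ENDIF.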

  • While uploading data from local file

    Hi,
    While uploading data from a local file that has a header text line into an internal table using the GUI_UPLOAD function module, there is a problem with the upload because of the header text in the file. Do we have any option in this FM itself to skip the header text and upload from the second line? As per our requirement, we can't delete the header text from the file.

    Hi Manish,
    Do you have a problem during the upload itself? Is it giving any error?
    If not, and the upload into the internal table is successful, then delete the first row (the header) from the internal table using index 1, as sketched below.
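    A rough sketch of that (the file path and table type are illustrative):
    * Upload including the header line, then drop the first row.
    DATA lt_data TYPE TABLE OF string.
    CALL FUNCTION 'GUI_UPLOAD'
      EXPORTING
        filename = 'C:\temp\input.txt'
        filetype = 'ASC'
      TABLES
        data_tab = lt_data
      EXCEPTIONS
        OTHERS   = 1.
    IF sy-subrc = 0.
      DELETE lt_data INDEX 1. " remove the header text row
    ENDIF.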
    Regards,
    Chandra Sekhar

  • Exporting data from text file to a table using utl_file

    Dear all,
    I have a text file as below, and I have a table with 12 columns. Now I need to insert the data from this text file into the table story_books.
    CREATE TABLE story_books (
    book_id NUMBER,
    Category VARCHAR2(100 BYTE),
    Book_type VARCHAR2(100 BYTE),
    Name VARCHAR2(700 BYTE),
    Location VARCHAR2(700 BYTE),
    Ownership_code VARCHAR2(700 BYTE),
    Author VARCHAR2(700 BYTE),
    Less_Sel_fact VARCHAR2(700 BYTE),
    Reason VARCHAR2(700 BYTE),
    Buying VARCHAR2(700 BYTE),
    Suspected_Book VARCHAR2(700 BYTE),
    Conditions VARCHAR2(700 BYTE)
    );
    -------------------------text file---------------
    Books Out Table: Books
    Book. Type          Name          Location               Ownership Code
    Story               SL          hyd               SS-HYD
    Known Author:     Unknown               
    Less Selling Factors: Thunderstorms     
    Reason:     Unknown               
    Buying (if applicable):
    Not Applicable
    Suspected Book:
    Unknown
    Conditions to increace sales:
    Advertisement in all areas
    I was able to read and store the data when it is on the same line, but I don't know how to read the data below:
    Book. Type          Name          Location               Ownership Code
    Story               SL          hyd               SS-HYD
    In this data I have to search for 'Book. Type' and save the word 'Story' into the column 'Book_type'.
    Then I need to search for 'Name' and save 'SL' into the column 'Name'.
    Then I need to search for 'Location' and save 'hyd' into the column 'Location'.
    I was able to extract the data using utl_file.get_line when it is in the format below:
    Known Author:     Unknown               
    Less Selling Factors: Thunderstorms     
    Reason:     Unknown     
    Can anyone explain how to solve the above?
    Thanks in advance.

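    One way to handle the header-plus-values layout with utl_file is sketched below. This is a minimal, untested outline: the directory object and file name are illustrative, and only four of the twelve columns are filled.
    -- Sketch: find the 'Book. Type' header line, then parse the values
    -- from the line that follows it (directory/file names are examples).
    DECLARE
      l_file UTL_FILE.FILE_TYPE;
      l_line VARCHAR2(4000);
    BEGIN
      l_file := UTL_FILE.FOPEN('DATA_DIR', 'books.txt', 'R');
      LOOP
        BEGIN
          UTL_FILE.GET_LINE(l_file, l_line);
        EXCEPTION
          WHEN NO_DATA_FOUND THEN EXIT;  -- end of file
        END;
        IF l_line LIKE 'Book. Type%' THEN
          UTL_FILE.GET_LINE(l_file, l_line);  -- the data row follows the header
          INSERT INTO story_books (book_type, name, location, ownership_code)
          VALUES (REGEXP_SUBSTR(l_line, '[^[:space:]]+', 1, 1),
                  REGEXP_SUBSTR(l_line, '[^[:space:]]+', 1, 2),
                  REGEXP_SUBSTR(l_line, '[^[:space:]]+', 1, 3),
                  REGEXP_SUBSTR(l_line, '[^[:space:]]+', 1, 4));
        END IF;
      END LOOP;
      UTL_FILE.FCLOSE(l_file);
    END;
    /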

  • Problem in ALV export to Excel (Local File)

    Hi guys,
    I'm using the option to export data to a local file (Excel option), but some strange behavior is happening...
    Some columns are exported without the last digit...
    Ideas?
    Regards,
    LMS

    Hi LMS,
    Use these links:
    Export to excel in ALV
    Problem with Export to Excel from ALV Grid
    Hope this will help you to resolve your problem.
    Regards,
    Vijay

  • Unable to export data from Web Access Data Sheet in SharePoint to local Excel or Access file

    Greetings and good morning.
    I'm going to start off in broad terms with this question because I'm not 100 percent sure what information to provide.
    Long story short, we've got a Web Access Data Sheet list hosted in a SharePoint 2010 environment. It is accessed and used by multiple people throughout the day. It contains several thousand line item entries. I'd call it a large data sheet.
    I think the size of the data sheet is causing some instability in the list. I'd like to be able to export a defined range of data from the list into a local Excel or Access file. After that, I'd delete that data from the list to improve performance.
    But... when I attempt to use the SharePoint action bar to export, Excel locks up/crashes. If I try to export to Access, I get a similar issue.
    Any ideas? Could anyone begin by telling me what other information is required?

    Hi,
    If you would like to export data from an Access Web Database in SharePoint 2010, you can go to the Design With Access page in Settings. The URL in my environment is http://sp/tt/_layouts/accsrv/ModifyApplication.aspx . Then choose the table and export it to Excel, or modify it in Access.
    Regards,
    Rebecca Tu
    TechNet Community Support

  • Import/export data from MS SQL 7 to an XML file in a particular XPDL format

    I'm trying to write a Java program to import/export data from MS SQL 7 to an XML file in a particular XPDL format. Can anyone suggest how to do it?

    Take a glance at what these guys do: http://www.openbusinessengine.org/docs/guide.html
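    For the export half, the skeleton is plain JDBC plus a writer. A minimal sketch follows; the connection URL, query, and element names are all illustrative, and a real XPDL file must follow the schema your workflow engine expects.
    // Sketch: dump rows from MS SQL to a simple XML file via JDBC.
    import java.io.PrintWriter;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class SqlToXml {
        public static void main(String[] args) throws Exception {
            // e.g. an ODBC DSN pointing at the SQL Server 7 database (illustrative)
            Connection con = DriverManager.getConnection("jdbc:odbc:MyMsSql7Dsn");
            Statement st = con.createStatement();
            ResultSet rs = st.executeQuery("SELECT id, name FROM processes");
            PrintWriter out = new PrintWriter("processes.xml", "UTF-8");
            out.println("<?xml version=\"1.0\" encoding=\"UTF-8\"?>");
            out.println("<Package>");  // placeholder root; real XPDL needs its schema
            while (rs.next()) {
                out.println("  <WorkflowProcess Id=\"" + rs.getInt("id")
                        + "\" Name=\"" + rs.getString("name") + "\"/>");
            }
            out.println("</Package>");
            out.close();
            con.close();
        }
    }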

  • How to import data from CSV file with columns separated by semicolon?

    I am migrating a database from MS SQL 2008 to Oracle 11g.
    I exported the data from MS SQL to CSV files and then tried to import them into Oracle.
    Several tables went fine using the Import Data option in SQL Developer: standard CSV files with data separated by commas were imported, and char, date (with a format string), and integer data all came in via the import wizard without problems.
    The problems started when I tried to import a table with non-integer numbers whose decimal part is separated by a comma rather than a dot. Since the comma is the standard column separator in a CSV file, I had to change the column separator to a semicolon, but then the import wizard cannot correctly recognize the column data, because it only uses the standard CSV comma separator. :-/
    In SQL Developer 1.5.3, under Tools -> Preferences -> Migration -> Data Move Options, I changed "End of Column Delimiter" to ';', but it doesn't work.
    Is it possible to change the standard column separator for the Import Data wizard in SQL Developer 1.5.3?
    Or does anyone know how to import data in SQL Developer 1.5.3 from a CSV whose column separator is a semicolon?

    A new preference has been added in the main code line to customize the import delimiter. This should be available as part of a future release.
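    In the meantime, one possible workaround is to let the database parse the file itself via an external table. This is only a sketch: the directory, table, and column names are illustrative, and the decimal-comma values are read as text and converted explicitly.
    -- Workaround sketch: parse the semicolon-delimited file server-side.
    CREATE DIRECTORY data_dir AS '/tmp/import';

    CREATE TABLE import_ext (
      id     NUMBER,
      amount VARCHAR2(30)  -- keep as text; convert the decimal comma below
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ';'
      )
      LOCATION ('data.csv')
    );

    INSERT INTO target_table (id, amount)
    SELECT id,
           TO_NUMBER(amount, '999999D99', 'NLS_NUMERIC_CHARACTERS='',.''')
    FROM import_ext;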

  • Importing/Exporting data from Microsoft Access

    Hello to everybody!
    I have to create a management software (client-server).
    CLIENT SIDE: Clients insert data into forms, and these data are stored in an Access database (the client database). Every month, clients send some of the stored data (that month's data) to the server.
    SERVER SIDE: The "server man" receives those data from the clients by email and needs to store them in the main server database.
    Note: the data to be sent are not large and, naturally, the client and server databases are identical (same structure)!
    My questions are:
    1) What is the best way to export data from the Access database to a file? (I imagine I need to use an .XML file.) Do I need to use a particular package or software?
    2) What is the best way to import data from a file (maybe .XML) into the server's Access database? Do I need to use a particular package or software?
    This is my first software project of this kind.
    Please HELP ME!
    Thanks in advance, Liuk.

    > 1) What is the best way to export data from database access to a file?
    That depends on many things.
    > (I imagine I need to use .XML file).
    No! That's not what XML is for.
    > Do I need to use particular package or software??
    Again, that depends on a lot of things.
    > 2) What is the best way to import data from a file (maybe .XML) into a database access (of server)?? Do I need to use particular package or software??
    Look at the Java IO packages for reading data from files. Then look at JDBC for writing data to databases. Avoid XML; you don't need it.
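    To make the JDBC suggestion concrete, here is a minimal sketch; the ODBC data source name, file layout, and table are all illustrative:
    // Sketch: read a semicolon-delimited monthly file and insert the rows
    // into the server database over JDBC (DSN/table/file are examples).
    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class MonthlyImport {
        public static void main(String[] args) throws Exception {
            Connection con = DriverManager.getConnection("jdbc:odbc:ServerDB");
            PreparedStatement ps = con.prepareStatement(
                    "INSERT INTO monthly_data (client_id, amount) VALUES (?, ?)");
            BufferedReader in = new BufferedReader(new FileReader("month.csv"));
            String line;
            while ((line = in.readLine()) != null) {
                String[] cols = line.split(";");  // one record per line
                ps.setInt(1, Integer.parseInt(cols[0]));
                ps.setBigDecimal(2, new java.math.BigDecimal(cols[1]));
                ps.executeUpdate();
            }
            in.close();
            ps.close();
            con.close();
        }
    }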

  • Using export/import to migrate data from 8i to 9i

    We are trying to migrate all data from an 8i database to a 9i database. We plan to migrate the data using the export/import utilities so that we can keep the current 8i database intact. The 8i and 9i databases will reside on the same machine. Our 8i database size is around 300 GB.
    We plan to follow the steps below:
    Export data from 8i
    Install 9i
    Create tablespaces
    Create schema and tables
    create user (user used for exporting data)
    Import data in 9i
    Please let me know if the par file below is correct for the export:
    BUFFER=560000
    COMPRESS=y
    CONSISTENT=y
    CONSTRAINTS=y
    DIRECT=y
    FEEDBACK=1000
    FILE=dat1.dmp, dat2.dmp, dat3.dmp (more filenames here)
    FILESIZE=2048GB
    FULL=y
    GRANTS=y
    INDEXES=y
    LOG=export.log
    OBJECT_CONSISTENT=y
    PARFILE=exp.par
    ROWS=y
    STATISTICS=ESTIMATE
    TRIGGERS=y
    TTS_FULL_CHECK=TRUE
    Thanks,
    Vinod Bhansali

    I recommend you change some parameters and remove others:
    BUFFER=560000
    COMPRESS=y -- consolidates each object into one extent (it is good)
    CONSISTENT=y
    CONSTRAINTS=y
    DIRECT=n -- if you set this parameter to yes you can have problems with some objects
    FEEDBACK=1000
    FILE=dat1.dmp, dat2.dmp, dat3.dmp (more filenames here)
    FILESIZE=2048GB
    FULL=y
    GRANTS=y -- this value is the default (not necessary)
    INDEXES=y
    LOG=export.log
    OBJECT_CONSISTENT=y -- (start the database in restricted mode and do not set this param)
    PARFILE=exp.par
    ROWS=y
    STATISTICS=ESTIMATE -- this value is the default (not necessary)
    TRIGGERS=y -- this value is the default (not necessary)
    TTS_FULL_CHECK=TRUE
    You can see which parameters are not needed by running this command:
    [oracle@ozawa oracle]$ exp help=y
    Export: Release 9.2.0.1.0 - Production on Sun Dec 28 16:37:37 2003
    Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
    You can let Export prompt you for parameters by entering the EXP
    command followed by your username/password:
    Example: EXP SCOTT/TIGER
    Or, you can control how Export runs by entering the EXP command followed
    by various arguments. To specify parameters, you use keywords:
    Format: EXP KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
    Example: EXP SCOTT/TIGER GRANTS=Y TABLES=(EMP,DEPT,MGR)
    or TABLES=(T1:P1,T1:P2), if T1 is partitioned table
    USERID must be the first parameter on the command line.
    Keyword Description (Default)
    USERID username/password
    FULL export entire file (N)
    BUFFER size of data buffer
    OWNER list of owner usernames
    FILE output files (EXPDAT.DMP)
    TABLES list of table names
    COMPRESS import into one extent (Y)
    RECORDLENGTH length of IO record
    GRANTS export grants (Y)
    INCTYPE incremental export type
    INDEXES export indexes (Y)
    RECORD track incr. export (Y)
    DIRECT direct path (N)
    TRIGGERS export triggers (Y)
    LOG log file of screen output
    STATISTICS analyze objects (ESTIMATE)
    ROWS export data rows (Y)
    PARFILE parameter filename
    CONSISTENT cross-table consistency (N)
    CONSTRAINTS export constraints (Y)
    OBJECT_CONSISTENT transaction set to read only during object export (N)
    FEEDBACK display progress every x rows (0)
    FILESIZE maximum size of each dump file
    FLASHBACK_SCN SCN used to set session snapshot back to
    FLASHBACK_TIME time used to get the SCN closest to the specified time
    QUERY select clause used to export a subset of a table
    RESUMABLE suspend when a space related error is encountered(N)
    RESUMABLE_NAME text string used to identify resumable statement
    RESUMABLE_TIMEOUT wait time for RESUMABLE
    TTS_FULL_CHECK perform full or partial dependency check for TTS
    VOLSIZE number of bytes to write to each tape volume
    TABLESPACES list of tablespaces to export
    TRANSPORT_TABLESPACE export transportable tablespace metadata (N)
    TEMPLATE template name which invokes iAS mode export
    Export terminated successfully without warnings.
    [oracle@ozawa oracle]$
    Joel Pérez

  • Import data from text file

    We are trying to import data from a text file into fields in a form using the importTextData method for the Doc object. The script we are using looks like this:
    var doc = event.target;
    var returnCode = doc.importTextData("datafile.txt",0);
    The text file "datafile.txt" contains tab separated values in a header row (which corresonds to field names in the form) followed by tab separated datarows - all according to instructions in the API.
    The returnCode will result in an integer value, which in our case represent "Error: Invalid Row". It would help us a lot if anyone could give us a hint on what may be wrong or even better post a working solution.
    We have also tried using the importTextData method without any parameters (the user is then prompted to select the text file and then the specific row) but the result remains the same. The fields in the form are not populated with data and the message "Error: Invalid Row" is returned.
    Annika Lindqvist

    The problem you're experiencing is that importTextData is an Acrobat API call. As such, it executes on the AcroForm field equivalents of the XFA fields you've placed on the form. What's not indicated in the API documentation for importTextData is that the field names in the header row of the text file must be the full AcroForm SOM expressions.
    I've created a working sample to explain this.
    First, I created a form with a table with one row in it containing 3 columns named FirstName, LastName and Country.
    Then I created a data.txt file like this:
    FirstName LastName Country
    Fred Jones USA
    Kyle Francis Canada
    Sam Roberts UK
    I then placed a button on the form which calls "event.target.importTextData();" and ran the form in Preview.
    Of course, when I selected a row for import, nothing happened.
    I then used the "Avanced | Forms | Export Data from Form..." menu option in Acrobat after filling-in the fields in the table with some text and looked at the generated text file (note that you have to specify text as the output format). In there, I was able to figure-out what the full AcroFrom SOM expression was for each field I wanted to import data into.
    I changed my data file to look like this:
    form1[0].#subform[0].Table1[0].Row1[0].FirstName[0] form1[0].#subform[0].Table1[0].Row1[0].LastName[0] form1[0].#subform[0].Table1[0].Row1[0].Country[0]
    Fred Jones USA
    Kyle Francis Canada
    Sam Roberts UK
    After that, importTextData worked as expected without any errors.
    Stefan
    Adobe Systems
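    Putting that together: with the SOM-qualified header row in place, the button script can also surface the return code. A small sketch (the path is an example; 0 indicates success, and nonzero values map to warning/error conditions such as the "Invalid Row" one above):
    // Import the first data row and report any nonzero return code.
    var rc = event.target.importTextData("/c/temp/data.txt", 0);
    if (rc !== 0) {
        app.alert("importTextData returned code " + rc);
    }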

  • Import data from text file to cube

    Hi,
    I'm trying to load data from a text file to a cube.
    Just as an experiment, I exported data from a cube into a file, then wrote a script to import the same file back into the cube - it worked fine.
    After this I deleted some time members from the outline and tried loading again, but it errored out:
    ERROR MESSAGE : Unknown Member [2007-11-06] in Data Load, [22] Records Completed.
    I want the valid records to be loaded and the invalid ones rejected but this is not happening.
    This is the command i use:
    IMPORT 3 "/PATH/Filename.txt" 4 "n" 2 "IMP001.rul" "n" "/PATH/error.txt";
    I'm not using a rule file for this, as it was not working correctly with the rule file (IMP001).
    Please let me know how I can proceed with the import by just rejecting the records whose members are not found in the outline.
    Thanks in advance,
    Rags.

    Yes, the question wasn't intended to recommend it as a solution, but a recurring need for this type of manipulation could point to a more robust answer without making it overly complex.
    The "end version" of what I envisioned included a temporary cube that included all members ever loaded, and UDA's assigned to the "current set", for use in an export/import process. The export would be the source for the rebuild of the production cube. This end vision seemed a little on the complex side, but it would be robust and work long term. There are obviously different ways to simplify this approach.
    The simplification that my question was aimed for is to eliminate the use of a template outline from a theoretical approach I've used in the past (in a round about way). In essence, if you can leave the members in the production cube and just clear the data, you could use a flag value that is loaded to create a D-C value (say, on scenario) to perform the extract. Loading a "1" would generate output, and a "0" to suppress output (using "scenario" * "flag" for the report value). Deleting the member is thus not needed but done "virtually" with the flag on the output side.
    I know, my crazy thinking is not easy to follow, even for me -- but thinking out of the box serves me well most of the time.

  • Export data from a table to a text file using a Script Task

    Hi
    I am new to SSIS.
    I have to export data from a table and append it to an existing file through an SSIS Script Task.
    Please help.
    Thanks
    Umesh

    Hi Umesh,
    The data structure of the source table and the structure of the destination file are the same, right? Is the destination file a flat file? Do you have to do it through a Script Task? If the destination file is a flat file, this can be done easily using the stock tasks/components rather than .NET code. In the Data Flow Task, choose the appropriate source adapter (such as OLE DB Source or ADO.NET Source) to extract data from the source table, perform transformations if necessary, and then load to the destination file via a Flat File Destination. When setting up the Flat File Destination, uncheck the "Overwrite data in the file" option so that the extracted data will be appended to the existing file.
    If you do need to implement it through a Script Task/Component, you may benefit from the following code examples:
    http://stackoverflow.com/questions/8070163/how-to-add-custom-footer-to-an-ssis-flat-file-seperate-component-or-script-tas 
    http://stackoverflow.com/questions/8467326/add-header-and-footer-row-flat-file-ssis 
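    For reference, the core of such a script boils down to a data reader plus an appending StreamWriter. A rough sketch (this sits inside the SSIS-generated ScriptMain class; the connection string, query, and path are illustrative):
    // Rough sketch of the Script Task body: read the table and append
    // each row to the existing file.
    public void Main()
    {
        using (var conn = new System.Data.SqlClient.SqlConnection(
                "Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI"))
        using (var writer = new System.IO.StreamWriter(@"C:\out\export.txt", true))  // true = append
        {
            conn.Open();
            var cmd = new System.Data.SqlClient.SqlCommand(
                    "SELECT col1, col2 FROM dbo.MyTable", conn);
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    writer.WriteLine(reader[0] + "\t" + reader[1]);
                }
            }
        }
        Dts.TaskResult = (int)ScriptResults.Success;
    }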
    If you need further help with the script, I suggest that you ask a new question in the .NET forums, where you can get more dedicated support:
    http://social.msdn.microsoft.com/Forums/vstudio/en-US/home?category=netdevelopment 
    Regards,
    Mike Yin
    TechNet Community Support

  • How to export data from an Oracle table to a delimited file?

    I know how to load a delimited file into a table, but how do I export data from an Oracle table to a delimited file?
    Thanks in advance.

    Try looking at this link; it's long, but there are three different solutions discussed in it. If you look at Barbara Boehmer's solution in her posts in the link below, you'll see that she addresses your concern using spool files.
    Re: utl_smtp and triggers
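    The spool-file approach discussed there looks roughly like this (a sketch; the table and columns are illustrative):
    -- Classic SQL*Plus spool export (table/columns are examples).
    SET HEADING OFF
    SET FEEDBACK OFF
    SET PAGESIZE 0
    SET TRIMSPOOL ON
    SPOOL /tmp/emp.csv
    SELECT empno || ',' || ename || ',' || sal FROM emp;
    SPOOL OFF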
