Exporting data clusters with type version

Hi all,
let's assume we are saving some ABAP data as a cluster to the database using the EXPORT ... TO DATABASE statement, e.g.
EXPORT VBAK FROM LS_VBAK VBAP FROM LT_VBAP TO DATABASE INDX(QT) ID 'TEST'.
Some days later, the data can be imported:
IMPORT VBAK TO LS_VBAK VBAP TO LT_VBAP FROM DATABASE INDX(QT) ID 'TEST'.
Some months or years later, however, the IMPORT may terminate with a runtime error: since it is perfectly normal for ABAP types to be extended, new fields may have been added to the structures VBAK or VBAP in the meantime.
The data are not lost, however: using the method CL_ABAP_EXPIMP_UTILITIES=>DBUF_IMPORT_CREATE_DATA, they can be recovered from an XSTRING containing the export buffer. This creates data objects matching the content of the buffer. But the component names are lost - they get auto-generated names like COMP00001, COMP00002, etc., replacing the original names MANDT, VBELN, and so on.
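For illustration, here is a minimal sketch of that recovery path via a data buffer. The result of DBUF_IMPORT_CREATE_DATA is an assumption on my part (a table of name/data-reference pairs, similar to TAB_CPAR); please check the actual signature on your release:

* Sketch only - the result of DBUF_IMPORT_CREATE_DATA is assumed to be a
* table of name/data-reference pairs (TAB_CPAR-like); check the signature.
DATA lv_buffer TYPE xstring.

EXPORT vbak FROM ls_vbak vbap FROM lt_vbap TO DATA BUFFER lv_buffer.

DATA(lt_cpar) = cl_abap_expimp_utilities=>dbuf_import_create_data( lv_buffer ).
LOOP AT lt_cpar ASSIGNING FIELD-SYMBOL(<ls_cpar>).
  ASSIGN <ls_cpar>-dref->* TO FIELD-SYMBOL(<lv_data>).
  " <lv_data> now holds one exported data object, but with generated
  " component names COMP00001, COMP00002, ... instead of MANDT, VBELN, ...
ENDLOOP.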
So a natural question is how to save the type info ( = metadata) for the extracted data together with the data themselves:
EXPORT TYPES FROM LT_TYPES VBAK FROM LS_VBAK VBAP FROM LT_VBAP TO DATABASE INDX(QT) ID 'TEST'.
The table LT_TYPES should contain the meta type info for all exported data. For structures, this could be a DDFIELDS-like table containing the component information. For tables, additionally the table kind, key uniqueness and key components should be saved.
Actually, LT_TYPES should contain persistent versions of CL_ABAP_STRUCTDESCR, CL_ABAP_TABLEDESCR, etc. But it seems there is no serialization provided for the RTTI type info classes.
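Just to make the idea concrete, here is a minimal sketch (7.40+ syntax) of how the metadata could be collected with RTTI right before the export. The structure TY_TYPE_INFO and its field names are invented for illustration; a real implementation would have to decide exactly which attributes are worth persisting:

* Invented metadata record: one row per exported data object.
TYPES: BEGIN OF ty_type_info,
         name       TYPE string,           " name used in the EXPORT statement
         kind       TYPE abap_typecategory,
         components TYPE ddfields,         " components of the structure / table line
         table_kind TYPE abap_tablekind,
         unique     TYPE abap_bool,
         key        TYPE abap_keydescr_tab,
       END OF ty_type_info,
       ty_type_infos TYPE STANDARD TABLE OF ty_type_info WITH EMPTY KEY.

DATA(lt_types) = VALUE ty_type_infos( ).

* Describe the structure LS_VBAK
DATA(lo_struct) = CAST cl_abap_structdescr(
                    cl_abap_typedescr=>describe_by_data( ls_vbak ) ).
APPEND VALUE #( name       = 'VBAK'
                kind       = lo_struct->kind
                components = lo_struct->get_ddic_field_list( ) ) TO lt_types.

* Describe the internal table LT_VBAP
DATA(lo_table) = CAST cl_abap_tabledescr(
                   cl_abap_typedescr=>describe_by_data( lt_vbap ) ).
DATA(lo_line)  = CAST cl_abap_structdescr( lo_table->get_table_line_type( ) ).
APPEND VALUE #( name       = 'VBAP'
                kind       = lo_table->kind
                components = lo_line->get_ddic_field_list( )
                table_kind = lo_table->table_kind
                unique     = lo_table->has_unique_key
                key        = lo_table->key ) TO lt_types.

EXPORT types FROM lt_types vbak FROM ls_vbak vbap FROM lt_vbap
       TO DATABASE indx(qt) ID 'TEST'.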
(In an optimized version, the type info could be stored in a separate cluster and referenced in the data cluster by a version number only, for efficiency.)
In the import step, LT_TYPES could be imported first, and then instances of these historical data types could be created as containers for the real data import (here, I am inventing a class zcl_abap_expimp_utilities):
IMPORT TYPES TO LT_TYPES FROM DATABASE INDX(QT) ID 'TEST'.
DATA(LO_TYPES) = ZCL_ABAP_EXPIMP_UTILITIES=>CREATE_TYPE_INFOS( LT_TYPES ).
ASSIGN LO_TYPES->DATA_OBJECT( 'VBAK' )->* TO <LS_VBAK>.
ASSIGN LO_TYPES->DATA_OBJECT( 'VBAP' )->* TO <LT_VBAP>.
IMPORT VBAK TO <LS_VBAK> VBAP TO <LT_VBAP> FROM DATABASE INDX(QT) ID 'TEST'.
Now the data can be recovered with their historical types (i.e. the types they had when the export statement was performed) and processed further.
For example, structures and table lines could be merged into the current versions using MOVE-CORRESPONDING, and so on.
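A rough sketch of what such a helper could do internally: rebuild the historical line type with RTTC from the stored component list and create a matching data object via CREATE DATA ... TYPE HANDLE. It reuses the invented TY_TYPE_INFO and LT_TYPES from the export sketch above and maps only a few elementary type kinds, so it is an illustration rather than a complete implementation:

* Sketch: rebuild the historical structure type of VBAK and import into it.
DATA: lt_comp TYPE abap_component_tab,
      lo_elem TYPE REF TO cl_abap_elemdescr,
      lr_vbak TYPE REF TO data.

IMPORT types TO lt_types FROM DATABASE indx(qt) ID 'TEST'.

DATA(ls_type) = lt_types[ name = 'VBAK' ].
LOOP AT ls_type-components ASSIGNING FIELD-SYMBOL(<ls_field>).
  " Map the stored technical attributes to elementary type descriptors;
  " only a few type kinds are handled here for brevity.
  CASE <ls_field>-inttype.
    WHEN 'C'. lo_elem = cl_abap_elemdescr=>get_c( CONV #( <ls_field>-leng ) ).
    WHEN 'N'. lo_elem = cl_abap_elemdescr=>get_n( CONV #( <ls_field>-leng ) ).
    WHEN 'D'. lo_elem = cl_abap_elemdescr=>get_d( ).
    WHEN 'P'. lo_elem = cl_abap_elemdescr=>get_p( p_length   = CONV #( <ls_field>-intlen )
                                                  p_decimals = CONV #( <ls_field>-decimals ) ).
    WHEN OTHERS. lo_elem = cl_abap_elemdescr=>get_string( ).
  ENDCASE.
  APPEND VALUE #( name = <ls_field>-fieldname type = lo_elem ) TO lt_comp.
ENDLOOP.

DATA(lo_hist_struct) = cl_abap_structdescr=>create( lt_comp ).
CREATE DATA lr_vbak TYPE HANDLE lo_hist_struct.
ASSIGN lr_vbak->* TO FIELD-SYMBOL(<ls_vbak>).

IMPORT vbak TO <ls_vbak> FROM DATABASE indx(qt) ID 'TEST'.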
My question: is there any standard support for this functionality, i.e. exporting data clusters together with their type version?
Regards,
Rüdiger

The IMPORT statement works fine if the target internal table has all fields of the source internal table, plus some additional fields at the end (something like an append structure of VBAK).
Here is the snippet used.
TYPES:
  BEGIN OF ty,
    a TYPE i,
  END OF ty,
  BEGIN OF ty2.
    INCLUDE TYPE ty.
TYPES:
    b TYPE i,
  END OF ty2.
DATA: lt1 TYPE TABLE OF ty,
      ls TYPE ty,
      lt2 TYPE TABLE OF ty2.
ls-a = 2. APPEND ls TO lt1.
ls-a = 4. APPEND ls TO lt1.
EXPORT table = lt1 TO MEMORY ID 'ZTEST'.
IMPORT table = lt2 FROM MEMORY ID 'ZTEST'.
I guess the IMPORT statement would also behave fine if the current VBAK has more fields than the older VBAK.
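Worth adding in this context: as far as I know, the IMPORT statement itself offers conversion additions such as ACCEPTING PADDING and ACCEPTING TRUNCATION, which tolerate certain type changes between export and import (for example, components appended at the end or changed lengths). A hedged example:

* Read an old cluster into the current (extended or shortened) types.
IMPORT vbak TO ls_vbak vbap TO lt_vbap
       FROM DATABASE indx(qt) ID 'TEST'
       ACCEPTING PADDING       " target components may be longer / appended
       ACCEPTING TRUNCATION.   " target components may be shorter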

Similar Messages

  • TIFF export date issue with 40D images

    Hi.
    I have a strange issue with Aperture 1.5.6, Leopard 10.5 and Canon 40D RAW images.
    The problem is this:
    1) Import a 40D RAW image in aperture.
    2) The exif image date is 13.11.07 20:21:10.
    3) Then edit the image in Adobe Photoshop CS3 (TIFF).
4) Now we have two versions of the image: the original RAW file and a TIFF file.
    5) Export the two images with aperture to JPEG files.
    Now the exif date/time of the image exported (RAW) is 13.11.07 20:21:10 and that's correct.
The problem is that the time of the second image (TIFF) is exactly two hours later (13.11.07 22:21:10).
    In aperture the date/time is correct for both the images.
Can someone please try to replicate this issue?

    Thanks for your reply.
    1. I select the photos from iPhoto - varied between 5 and 20 (ran different test)
    2. Click on FILE
    3. Select EXPORT
    4. Under the FILE EXPORT tab I have tried all the various options with no success
    5. KIND - I make sure they are JPEG files - I have tried the Original also which is actually JPEG files
    6. JPEG QUALITY - I have tried all 4 options from LOW to MAX
    7. SIZE - I tried all 4 size options from SMALL to FULL
    8. FILE NAME - I use filename
    9. Select EXPORT tab
10. I export to the thumbdrive and I check that they are on the thumbdrive
    11. I also tried to EXPORT to the desktop and copy them to the thumbdrive
I click on the exported images I placed on the thumb drive and they come up fine in Preview on my Mac. No problem with the images until I try to use them on the plasma. I go back and use images from iPhoto 06 that I did not call up with the "08" program and they work fine.
    Hope that explains what I do. Thanks again

  • Exporting data of activity type 'FormSubmit' failed

    Hi all,
    I'm trying to export activity data out by Bulk APIs.
    But when using activity type 'FormSubmit', I get Sync status 'error' with message as following:
         [{"count":0,"createdAt":"\/Date(1426236500847)\/","message":"There was an error processing the export.","severity":"error","statusCode":"ELQ-00107","syncUri":"/syncs/64770"},{"syncUri":"/syncs/64770","count":0,"severity":"information","statusCode":"ELQ-00101","message":"Sync processed for sync 64770, resulting in Error status.","createdAt":"\/Date(1426236500847)\/"}]
The other activity types work well.
    Any suggestions?
    Thanks,
    Biao

    Hi,
I had included my sync log in the description above. It is this:
    [{"count":0,"createdAt":"\/Date(1426236500847)\/","message":"There was an error processing the export.","severity":"error","statusCode":"ELQ-00107","syncUri":"/syncs/64770"},{"syncUri":"/syncs/64770","count":0,"severity":"information","statusCode":"ELQ-00101","message":"Sync processed for sync 64770, resulting in Error status.","createdAt":"\/Date(1426236500847)\/"}]
    Thanks,
    Biao

  • Error on Data Flow Task MSSQL 2012 Clustered "Description: The version of Lookup is not compatible with this version of the DataFlow. "

We have an SSIS package that runs on clustered MSSQL 2012 Enterprise nodes and that is failing. We use a job to execute the package.
    Environmental information:
    Product - Microsoft SQL Server Enterprise: Core-based Licensing (64-bit)
    Operating System - Microsoft Windows NT 6.1 (7601)
Platform - NT x64
    Version - MSSQL Version 11.0.3349.0
The package is set to 32-bit. All permissions verified. It runs in lower environments with the same MSSQL version. All environments are clustered. In the failing environment, all nodes are at the same service pack. I have not verified whether all nodes in the failing environment have SSIS installed. Data access is installed. We have other, simpler packages that run in this environment, just not this one. Time to ask the community for help!
    Error:
    Source: Data Flow Task - Data Flow Task (SSIS.Pipeline)     Description: The version of Lookup is not compatible with this version of the DataFlow.  End Error  Error:  Code: 0xC0048020    
    Description: Component "Conditional Split, clsid {7F88F654-4E20-4D14-84F4-AF9C925D3087}" could not be created and returned error code 0x80070005 "Access is denied.". Make sure that the component is registered correctly.  End Error 
    Description: The component is missing, not registered, not upgradeable, or missing required interfaces. The contact information for this component is "Conditional Split;Microsoft Corporation; Microsoft SQL Server; (C) Microsoft Corporation; All Rights
    Reserved; http://www.microsoft.com/sql/support;0".  End Error 
    (Left out shop specific information.  This is the first error in the errors returns by the job history for this package. )
    Thanks in advance.

    Hi DeveloperMax,
According to your description, the error occurs when you execute the package with an Agent job on clustered MSSQL 2012 Enterprise nodes.
As per my understanding, this issue can be caused by using SQL Server Agent to schedule a SQL Server Integration Services package in a 64-bit environment while the SSIS package references 32-bit DLLs or 32-bit drivers that are available only in 32-bit versions, so the job fails.
To fix this issue, we should run the package with the 32-bit runtime (for example, via the 32-bit version of the DTExec.exe utility). To run a package in 32-bit mode from a 64-bit version of SQL Server Agent, we can go to the Job Step dialog box and select “Use 32 bit runtime” on the Execution options tab.
    Besides, we should make sure that SQL Server Integration Services is installed on the failing environment.
    If there are any other questions, please feel free to ask.
    Thanks,
    Katherine Xiong
    TechNet Community Support

  • UDI-00018: Data Pump client is incompatible with database version 11.2.0.1

    Hi
I am trying to import data into Oracle 11g Release 2 (11.2.0.1) using the impdp utility and am getting the error below:
    UDI-00018: Data Pump client is incompatible with database version 11.2.0.1.0
The export dump was taken on a database with Oracle 11g Release 1 (11.1.0.7.0), and I am trying to import it into a higher version of the database. Is there any parameter I have to set to avoid this error?

    AUTHSTATE=compat
    A__z=! LOGNAME
    CLASSPATH=/app/oracle/11.2.0/jlib:.
    HOME=/home/oracle
    LANG=C
    LC__FASTMSG=true
    LD_LIBRARY_PATH=/app/oracle/11.2.0/lib:/app/oracle/11.2.0/network/lib:.
    LIBPATH=/app/oracle/11.2.0/JDK/JRE/BIN:/app/oracle/11.2.0/jdk/jre/bin/classic:/app/oracle/11.2.0/lib32
    LOCPATH=/usr/lib/nls/loc
    LOGIN=oracle
    LOGNAME=oracle
    MAIL=/usr/spool/mail/oracle
    MAILMSG=[YOU HAVE NEW MAIL]
    NLSPATH=/usr/lib/nls/msg/%L/%N:/usr/lib/nls/msg/%L/%N.cat
    NLS_DATE_FORMAT=DD-MON-RRRR HH24:MI:SS
    ODMDIR=/etc/objrepos
    ORACLE_BASE=/app/oracle
    ORACLE_HOME=/app/oracle/11.2.0
    ORACLE_SID=AMT6
    ORACLE_TERM=xterm
    ORA_NLS33=/app/oracle/11.2.0/nls/data
    PATH=/app/oracle/11.2.0/bin:.:/usr/bin:/etc:/usr/sbin:/usr/ucb:/home/oracle/bin:/usr/bin/X11:/sbin:.:/usr/local/bin:/usr/ccs/bin
    PS1=nbsud01[$PWD]:($ORACLE_SID)>
    PWD=/nbsiar/nbimp
    SHELL=/usr/bin/ksh
    SHLIB_PATH=/app/oracle/11.2.0/lib:/usr/lib
    TERM=xterm
    TZ=Europe/London
    USER=oracle
    _=/usr/bin/env

  • Problem with Import and Export Data Wizard

    Downloaded and installed SQL Server Express 2008 R2 today because I want to explore how Access interacts with SQL Server (using my home computer). I'm using Access 2010 (under Windows 7), so the 2008 version of SQL Server Express seemed to be the version
    to use.
    After a couple of false starts, installation appeared to go okay. After the installation. My Start menu listed Microsoft SQL Server 2008 and Microsoft SQL Server 2008 R2. The latter listed Import and Export Data (64-bit). When I clicked that, the first Import
    and Export Data Wizard page was displayed. I wasn't ready at that time to explore the wizard, so I closed it. An hour or so later I again attempted to open the Import and Export Data wizard. This time, the wizard didn't open. Instead this error message was
    displayed: "The SSIS Runtime object could not be created. Verify that DTS.dll is available and registered."
    I found DTS.dll on my computer at C:\Program Files\Microsoft SQL Server\100\DTS\Binn, so the file is available, but don't know whether it is registered.
    How can I correct this problem?

    First can you please post all log file errors
>> I can't really give you a solution or specific recommendation since I have not seen this error myself, but at your own risk you can try:
    1. You may try to just register 'dts.dll' using regsvr32.exe, but this error may indicate a bigger problem with setup.
    If you are running SQL Server 64bit then try running this at the command prompt: %windir%\syswow64\regsvr32 "%ProgramFiles(x86)%\Microsoft SQL Server\90\dts\binn\dts.dll"
2. You can try reinstalling from scratch (in this case you have to make sure that you uninstall everything first).

  • Help needed with Export Data Pump using API

    Hi All,
    Am trying to do an export data pump feature using the API.
While the export as well as the import works fine from the command line, it fails with the API.
    This is the command line program:
    expdp pxperf/dba@APPN QUERY=dev_pool_data:\"WHERE TIME_NUM > 1204884480100\" DUMPFILE=EXP_DEV.dmp tables=PXPERF.dev_pool_data
Could you help me with how to achieve the same as above using the Oracle Data Pump API?
DECLARE
  h1 NUMBER;
BEGIN
  h1 := dbms_datapump.open('EXPORT','TABLE',NULL,'DP_EXAMPLE10','LATEST');
  dbms_datapump.add_file(h1,'example3.dmp','DATA_PUMP_TEST',NULL,1);
  dbms_datapump.add_file(h1,'example3_dump.log','DATA_PUMP_TEST',NULL,3);
  dbms_datapump.metadata_filter(h1,'NAME_LIST','(''DEV_POOL_DATA'')');
END;
/
Also, in the API I want to know how to export and import multiple tables (selective tables only) using one single criterion like "WHERE TIME_NUM > 1204884480100\".

    Yes, I have read the Oracle doc.
I was able to proceed as below, but it gives an error.
    ============================================================
    SQL> SET SERVEROUTPUT ON SIZE 1000000
    SQL> DECLARE
    2 l_dp_handle NUMBER;
    3 l_last_job_state VARCHAR2(30) := 'UNDEFINED';
    4 l_job_state VARCHAR2(30) := 'UNDEFINED';
    5 l_sts KU$_STATUS;
    6 BEGIN
    7 l_dp_handle := DBMS_DATAPUMP.open(
    8 operation => 'EXPORT',
    9 job_mode => 'TABLE',
    10 remote_link => NULL,
    11 job_name => '1835_XP_EXPORT',
    12 version => 'LATEST');
    13
    14 DBMS_DATAPUMP.add_file(
    15 handle => l_dp_handle,
    16 filename => 'x1835_XP_EXPORT.dmp',
    17 directory => 'DATA_PUMP_DIR');
    18
    19 DBMS_DATAPUMP.add_file(
    20 handle => l_dp_handle,
    21 filename => 'x1835_XP_EXPORT.log',
    22 directory => 'DATA_PUMP_DIR',
    23 filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    24
    25 DBMS_DATAPUMP.data_filter(
    26 handle => l_dp_handle,
    27 name => 'SUBQUERY',
    28 value => '(where "XP_TIME_NUM > 1204884480100")',
    29 table_name => 'ldev_perf_data',
    30 schema_name => 'XPSLPERF'
    31 );
    32
    33 DBMS_DATAPUMP.start_job(l_dp_handle);
    34
    35 DBMS_DATAPUMP.detach(l_dp_handle);
    36 END;
    37 /
    DECLARE
    ERROR at line 1:
    ORA-39001: invalid argument value
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 3043
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 3688
    ORA-06512: at line 25
    ============================================================
I have a table called LDEV_PERF_DATA and it is in schema XPSLPERF.
value => '(where "XP_TIME_NUM > 1204884480100")' above is the condition on which I want to filter the data.
    However, the below snippet works fine.
    ============================================================
    SET SERVEROUTPUT ON SIZE 1000000
    DECLARE
    l_dp_handle NUMBER;
    l_last_job_state VARCHAR2(30) := 'UNDEFINED';
    l_job_state VARCHAR2(30) := 'UNDEFINED';
    l_sts KU$_STATUS;
    BEGIN
    l_dp_handle := DBMS_DATAPUMP.open(
    operation => 'EXPORT',
    job_mode => 'SCHEMA',
    remote_link => NULL,
    job_name => 'ldev_may20',
    version => 'LATEST');
    DBMS_DATAPUMP.add_file(
    handle => l_dp_handle,
    filename => 'ldev_may20.dmp',
    directory => 'DATA_PUMP_DIR');
    DBMS_DATAPUMP.add_file(
    handle => l_dp_handle,
    filename => 'ldev_may20.log',
    directory => 'DATA_PUMP_DIR',
    filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    DBMS_DATAPUMP.start_job(l_dp_handle);
    DBMS_DATAPUMP.detach(l_dp_handle);
    END;
    ============================================================
I don't want to export all contents as above, but want to export data based on some conditions and only for selective tables.
    Any help is highly appreciated.

  • Problem with delta update on customised export data source.

    Hi all BW gurus,
    I have created several customised export data sources in the R/3 system. And I have made use of the utility program <Z_CHANGE_DELTA_PROCESS> to update the ROOSOURCE table.
While this makes it possible to do an initial update (instead of a full update), the subsequent delta update is not working.
In RSA7, the DataSource shows a green light, and in RSA3 the extractor can extract data from it.
    Does anybody have some idea on this?
I heard that there is a way to change the Business Transaction Event (BTE), but I don't quite understand how this is done, and I wonder if this is the only way out. To me, the triggering point would be the same: the InfoPackage would be executed, which would then call the extractors in R/3 to extract the delta in order to load it into the InfoCube.
And how do others execute delta updates of customised data sources?
    Thanks a lot!!
The utility program Z_CHANGE_DELTA_PROCESS:
    =================================================
    report z_change_delta_process .
    *P_DATAS DataSource
    *P_DELTAP Delta Process for DataSource
    parameters:
      p_datas type roosource-oltpsource,
      p_deltap type roosource-delta.
    tables:
      roosource.
data:
  ls_roosource type roosource.

if p_datas(2) ne 'Z_'.
  message 'The DataSource needs to begin with ''Z_''.' type 'E'.
endif.

select single * from roosource into ls_roosource
  where oltpsource = p_datas
    and objvers = 'A'.

if sy-subrc eq 0.
  ls_roosource-delta = p_deltap.
  update roosource from ls_roosource.
  message 'The DataSource has been updated successfully.' type 'I'.
else.
  message 'The DataSource entered is not valid, try again.' type 'E'.
endif.

    Doesn't anyone have any idea on this topic?
I appreciate all the help. Thanks.

  • Schema export via Oracle data pump with Database Vault enabled question

    Hi,
    I have installed and configured Database Vault on an Oracle 11g-r2-11.2.0.3 to protect a specific schema (SCHEMA_NAME) via a realm. I have followed the following doc:
    http://www.oracle.com/technetwork/database/security/twp-databasevault-dba-bestpractices-199882.pdf
to ensure that the SYS and SYSTEM users have sufficient rights to complete a scheduled Oracle Data Pump export operation.
    I.e. I have granted to sys and system the following:
    execute dvsys.dbms_macadm.authorize_scheduler_user('sys','SCHEMA_NAME');
    execute dvsys.dbms_macadm.authorize_scheduler_user('system','SCHEMA_NAME');
    execute dvsys.dbms_macadm.authorize_datapump_user('sys','SCHEMA_NAME');
    execute dvsys.dbms_macadm.authorize_datapump_user('system','SCHEMA_NAME');
I have also created a second realm on the same schema (SCHEMA_NAME) to allow sys and system to maintain indexes for realm-protected tables. This separate realm was created for all their index types: Index, Index Partition, and Indextype; sys and system have been authorized as OWNER of this realm.
    However, when I try and complete an Oracle Data Pump export operation on the schema, I get two errors directly after the following line displayed in the export log:
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/DOMAIN_INDEX/INDEX:
    ORA-39127: unexpected error from call to export_string :=SYS.DBMS_TRANSFORM_EXIMP.INSTANCE_INFO_EXP('AQ$_MGMT_NOTIFY_QTABLE_S','SYSMAN',1,1,'11.02.00.00.00',newblock)
    ORA-01031: insufficient privileges
    ORA-06512: at "SYS.DBMS_TRANSFORM_EXIMP", line 197
    ORA-06512: at line 1
    ORA-06512: at "SYS.DBMS_METADATA", line 9081
    ORA-39127: unexpected error from call to export_string :=SYS.DBMS_TRANSFORM_EXIMP.INSTANCE_INFO_EXP('AQ$_MGMT_LOADER_QTABLE_S','SYSMAN',1,1,'11.02.00.00.00',newblock)
    ORA-01031: insufficient privileges
    ORA-06512: at "SYS.DBMS_TRANSFORM_EXIMP", line 197
    ORA-06512: at line 1
    ORA-06512: at "SYS.DBMS_METADATA", line 9081
The export completes, but with these errors.
Any help, suggestions, pointers, etc. (actually anything) will be very welcome at this stage.
    Thank you

    Hi Srini,
Thank you very much for your help. Unfortunately, after having followed the instructions in the doc, I am still getting the same errors.
Nonetheless, thank you for your input.
I was also wondering if someone could tell me how to move this thread to the Database Security area of the forum, as I feel I may have posted the thread in the wrong place; it appears to be a Database Vault issue and not an imp/exp problem.

  • Has anyone had issues with Administration\Data Import/Export\Data Import???

I have a client who has recently upgraded from V2007 to V8.81. They were successfully using this standard function to import supplier prices into their master price list, but now it has failed.
    I have looked at the file they are importing and it appears to be fine.
    On closer inspection, it did contain approx 46,000 entries, so I took the first 1,000 and created a test file, which imported fine.
The only issue I found was speed, with the test file of 1,000 records taking about 30 minutes to import. It appeared to get slower and slower the further through the file it got!
Based on this, I have estimated that the whole file would take about 13 hours to import. The client says that when they used to run it on version 2007 it was far quicker.
In practice, it does appear to run, but the speed is the issue. Having said this, I set the whole file to run last night (overnight), and this morning it appeared to have hung after about 2,307 rows, with nothing else being updated.
Does anyone have any ideas, or is anyone aware of performance issues like this?
    Thanks,
    Ian

    Always an option, but would you give your clients access to this tool?
    Not sure really.
I have uploaded a copy of their database onto my test system and run the same routine. It is equally slow.
I can't gauge whether it is an issue with 8.81 that 2007 didn't have, as I only have the client's word on it; however, I have no reason to disbelieve them.
    Kind regards,
    Ian

  • Exporting Data from one Server to Another server w/ Version Enabled Tables

    Hi,
I'm currently having a problem with regard to exporting data to another server. This is the scenario:
The source server is a production server, with all of the tables in the schema version-enabled.
The destination server is a test server.
I exported data from the production server using the EXP command. Then on my test server I imported my data using the IMP command (I had already created the tablespace and user for the schema).
The import is successful on my test server, but when I execute my queries, there are no rows returned.
I checked my _LT tables and they contain my data, but when I query the view created when versioning was enabled, no result is returned.
    Am I missing something when I exported and imported my Schema? Should I have included the WMSYS schema when I created the .dump file?
    Thanks in advance.

    Hi Stefan,
    we tried using Export and Import using Data Pump.
    expdp system/password@orcl full=y directory=dmpdir2 dumpfile=FULL_DB.dmp
    impdp system/password@orcl full=y table_exists_action=truncate directory=dmpdir2 dumpfile=FULL_DB.dmp
Still the same result as with exp and imp: the _LT tables have data, but when you query using the view, no results are found.

• How to write an export dump command with no table data, only table structure

How do I write an export dump command with no table data, only table structure, and a command for the whole schema?
E.g. an export dump command for the SCOTT schema and all tables within the SCOTT schema, where no table data should be exported.

    If I understand the question, it sounds like you just need to add the flag "ROWS=N" to your export command (I assume that you're talking about the old export utility, not the Data Pump version).
    Justin

  • Which version of the Forms Central subscription will allow me to export data to excel (CSV file)?

    Which version of the Forms Central subscription will allow me to export data to excel (CSV file)?

    Hi Heather 349
I believe you can do it with both FormsCentral Basic and FormsCentral PLUS.
Also refer to: Tutorial: Exporting Responses to Excel, CSV or PDF

  • Export data with Dreamweaver (TXT, CSV)

    Hi to all of the Adobe community.
My question is a little hard for my level of knowledge.
I'm using Dreamweaver with PHP / MySQL; in my MySQL database I have a table called mailing list, with the fields name and email.
I have a PHP page that I developed using the Dynamic Table feature, which returns a list of registered names and e-mails. What I would like is to have, at the end of this table, a link or a button called "export data". When exported, this data should be in TXT or CSV format.
    How?
I want to export the data and also define the format to be exported.
    I'm using Dreamweaver CS4.
    Rodrigo Vieira da Silva Eufrasio
    E-mail: rodrigo.mct @ gmail.com
    Mobile: +55 11 8183-9484
    Brazil - Osasco - SP

    It is doable.
Assume for the moment that you are not paging the query results (that is, you are displaying ALL results at one time).
    You would need to do the following when the button is clicked (the action calling a separate file for the processing):
    Open a file on the server for writing ($ofile = fopen("data.txt","w");)
Repeat your query, but instead of echo $row_Recordset... you would use fwrite($ofile, $row_Recordset...
    In between fields you would need fwrite($ofile, "\t") to put in a tab or fwrite($ofile, ",") for a CSV,
    At the end of every line you would need a <cr><lf>: fwrite ($ofile, "\r\n");
    Close the file - fclose ($ofile);
    Then use the header function to force the download of this file like this.
    header('Content-disposition: attachment; filename=data.txt');
    header('Content-type: text');
readfile('data.txt');
    The header function has to be the first thing sent to the user. Any white space would cause it to fail. I haven't tried this part after a DB query, but it should work. If it doesn't you will have to invoke the header routine in a separate file.
    Hope this helps.
    Barry

  • Export - only tables with data

Hi everybody!
I have a database in Oracle which has hundreds of tables. When I export the database, all the tables are exported into the .dmp file.
I want to export only the tables which have at least one row to the .dmp file.
Can anyone help me by providing a solution?
    Thanks
    JD

JayaDev(JD) wrote:
version 10g
In 11g, there is a feature called "deferred segment creation". It means that if you create a new table and do not insert any data into it, Oracle will not create a segment for that table; hence it will not be displayed in dba_segments.
That is why, if you try to export that table with the exp utility, it will not be exported.
