Export large schema without data

Hello,
I have a large DB on 10g (10 TB of data) and I need to create a copy of it for test purposes, but without data (only the empty schema: a lot of tablespaces and users). How do I achieve that?
Regards

Data Pump - use the CONTENT parameter and choose METADATA_ONLY:
http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/dp_export.htm#sthref90
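Since you need the tablespaces and users as well as the schemas, a full metadata-only export is the closest fit. A minimal sketch (the directory object and file names are placeholders; the exporting user needs the EXP_FULL_DATABASE role):
expdp system/password full=y content=metadata_only directory=dump_dir dumpfile=metadata_only.dmp logfile=metadata_only.log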

Similar Messages

  • Error while exporting a schema using data pump

    Hi all,
    I have an 11.1.0.7 database and am using expdp to export a schema. The schema is quite huge and has roughly 4 GB of data. When I export using the following command,
    expdp owb_exp_v1/welcome directory=dmpdir dumpfile=owb_exp_v1.dmp
    I get the following error after running for around 1 hour.
    ORA-39126: Worker unexpected fatal error in KUPW$WORKER.UNLOAD_METADATA [TABLESPACE_QUOTA:"OWB_EXP_V1"]
    ORA-22813: operand value exceeds system limits
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPW$WORKER", line 7839
    ----- PL/SQL Call Stack -----
    object line object
    handle number name
    4A974B9C 18237 package body SYS.KUPW$WORKER
    4A974B9C 7866 package body SYS.KUPW$WORKER
    4A974B9C 2744 package body SYS.KUPW$WORKER
    4A974B9C 8504 package body SYS.KUPW$WORKER
    4A961BF0 1 anonymous block
    4A9DAA4C 1575 package body SYS.DBMS_SQL
    4A974B9C 8342 package body SYS.KUPW$WORKER
    4A974B9C 1545 package body SYS.KUPW$WORKER
    4A8CD200 2 anonymous block
    Job "SYS"."SYS_EXPORT_SCHEMA_01" stopped due to fatal error at 14:01:23
    This owb_exp_v1 user has DBA privileges. I am not sure what is causing this error. I have tried running it almost three times, but in vain. I also tried increasing the sort_area_size parameter. Even then, I get this error.
    Kindly help.
    Thanks,
    Vidhya

    Hi,
    Can you let us know the last object type it was working on? It would be the line in the log file that looks like:
    Processing object type SCHEMA_EXPORT/...
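    For example, assuming the default Data Pump log name export.log in your DIRECTORY (adjust the name if you passed LOGFILE):
    grep "Processing object type" export.log | tail -1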
    Thanks
    Dean

  • Error when exporting large amounts of data to Excel from Apex 4

    Hi,
    I'm trying to export over 30,000 lines of data from a report in Apex 4 to an Excel spreadsheet, this is not using a csv file.
    It appears to be working and then I get 'No Response from Application Web Server'. The report works fine when exporting smaller amounts of data.
    We have just upgraded the application to Apex 4 from Apex 3, where it worked without any problem.
    Has anyone else had this problem? We were wondering if it was a parameter in Apex 4 that needs to be set.
    We are using Application Express 4.1.1.00.23 on Oracle 11g.
    Any help would be appreciated.
    Thanks
    Sue

    Hi,
    >
    I'm trying to export over 30,000 lines of data from a report in Apex 4 to an Excel spreadsheet, this is not using a csv file.
    >
    How? Application Builder > Data Workshop? Apex Page Process? (Packaged) procedure?
    >
    It appears to be working and then I get 'No Response from Application Web Server'. The report works fine when exporting smaller amounts of data.
    We have just upgraded the application to Apex 4 from Apex 3, where it worked without any problem.
    >
    Have you changed your webserver in the process? Say moved from OHS to ApexListener?
    >
    Has anyone else had this problem? We were wondering if it was a parameter in Apex 4 that needs to be set.
    We are using Application Express 4.1.1.00.23 on Oracle 11g.
    Any help would be appreciated.

  • Export large ASCP plan data to Excel

    Hi,
    We have a need to export plan results from an ASCP Plan to Excel. The Export option takes too long. I would like to know if there is an alternate method in Oracle EBS R12.1 ASCP to get the results of a Plan in Excel or a similar format.
    Thanks,
    Ash

    If you are exporting a substantial number of records to Excel, the best option is to use CSV format and then process the generated CSV file with Excel.
    Exporting directly to Excel format works well with small amounts of data, but with a large number of records system memory becomes an issue.
    If you feel the need to export directly into Excel format, you can try to increase the maximum memory allocated to SQL Developer by adding something like the following
    AddVMOption -Xmx1024M
    to the sqldeveloper.conf file, usually located in
    [SQLDeveloper_install_dir]\sqldeveloper\bin
    Adjust the number as you see fit, but bear in mind that there have been issues reported while meddling with this parameter.
    The default value for the parameter is stored in the ide.conf file, usually located in
    [SQLDeveloper_install_dir]\ide\bin

  • Exporting large amount of data

    I need help.
    I have to export the results of a query (possibly many rows, up to a million records) into a CSV file.
    Is it preferable to create a temporary file on the server and then serve it to the client, or
    is it better to write the response directly while browsing the recordset?
    Thanks
    Marco
    p.s.: sorry for my poor english.

    I am looking for the same thing: I want to create the CSV file inside the servlet and send it to the client using the download option.
    Regards,
    Suresh Babu G

  • Error while exporting large data from ReportViewer on an Azure-hosted website

    Hi,
    I have a website hosted on Azure. I use the SSRS ReportViewer control to display my reports. While doing so I faced an issue.
    Whenever I export a large amount of data as Excel/PDF/Word/TIFF, it abruptly throws the following error:
    Error: Microsoft.Reporting.WebForms.ReportServerException: The remote server returned an error: (401) Unauthorized. ---> System.Net.WebException: The remote server returned an error: (401) Unauthorized.
    at System.Net.HttpWebRequest.EndGetResponse(IAsyncResult asyncResult)
    at Microsoft.Reporting.WebForms.SoapReportExecutionService.ServerUrlRequest(AbortState abortState, String url, Stream outputStream, String& mimeType, String& fileNameExtension)
    --- End of inner exception stack trace ---
    at Microsoft.Reporting.WebForms.SoapReportExecutionService.ServerUrlRequest(AbortState abortState, String url, Stream outputStream, String& mimeType, String& fileNameExtension)
    at Microsoft.Reporting.WebForms.SoapReportExecutionService.Render(AbortState abortState, String reportPath, String executionId, String historyId, String format, XmlNodeList deviceInfo, NameValueCollection urlAccessParameters, Stream reportStream, String& mimeType, String& fileNameExtension)
    at Microsoft.Reporting.WebForms.ServerReport.InternalRender(Boolean isAbortable, String format, String deviceInfo, NameValueCollection urlAccessParameters, Stream reportStream, String& mimeType, String& fileNameExtension)
    at Microsoft.Reporting.WebForms.ServerReport.Render(String format, String deviceInfo, NameValueCollection urlAccessParameters, String& mimeType, String& fileNameExtension)
    at Microsoft.Reporting.WebForms.ServerModeSession.RenderReport(String format, Boolean allowInternalRenderers, String deviceInfo, NameValueCollection additionalParams, Boolean cacheSecondaryStreamsForHtml, String& mimeType, String& fileExtension)
    at Microsoft.Reporting.WebForms.ExportOperation.PerformOperation(NameValueCollection urlQuery, HttpResponse response)
    at Microsoft.Reporting.WebForms.HttpHandler.ProcessRequest(HttpContext context)
    at System.Web.HttpApplication.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()
    at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously)
    It works locally (developer machine) or with less data, but it doesn't work with large data when published on Azure.
    Any help will be appreciated.
    Thanks.

    Sorry, let me clarify my questions as they were ambiguous:
    For a given set of input, does the request always take the same amount of time to fail? How long does it take?
    When it works (e.g. on local machine using same input), how big is the output file that gets downloaded?
    Also, if you can share your site name (directly or
    indirectly), and the UTC time when you made an attempt, we may be able to get more info on our side.

  • Transporting large amounts of data from one database schema to another

    Hi,
    We need to move a large amount of data from one 10.2.0.4 database schema to another 11.2.0.3 database.
    Am currently using Data Pump but it is still quite slow - having to do it in chunks.
    Also, the Data Pump files are quite large, so we have to compress them and move them across the network.
    Is there a better/quicker way?
    Have heard about transportable tablespaces but never used them and don't know about speed - whether quicker than Data Pump.
    Tablespace names are different in both databases.
    Also, the source database is on the Solaris operating system on a Sun box;
    the target database is on AIX on an IBM Power Series box.
    Any ideas would be great.
    Thanks
    Edited by: user5716448 on 08-Sep-2012 03:30
    Edited by: user5716448 on 08-Sep-2012 03:31

    user5716448 wrote:
    Hi,
    We need to move a large amount of data from one 10.2.0.4 database schema to another 11.2.0.3 database.
    Pl quantify "large".
    Am currently using Data Pump but it is still quite slow - having to do it in chunks.
    Pl quantify "quite slow".
    Also, the Data Pump files are quite large, so we have to compress them and move them across the network.
    Again, pl quantify "quite large".
    Is there a better/quicker way?
    Have heard about transportable tablespaces but never used them and don't know about speed - whether quicker than Data Pump.
    Tablespace names are different in both databases.
    Also, the source database is on the Solaris operating system on a Sun box;
    the target database is on AIX on an IBM Power Series box.
    It may be possible, assuming you do not violate any of these conditions
    http://docs.oracle.com/cd/E11882_01/server.112/e25494/tspaces013.htm#ADMIN11396
    Any ideas would be great.
    Thanks
    Edited by: user5716448 on 08-Sep-2012 03:30
    Edited by: user5716448 on 08-Sep-2012 03:31
    Master Note for Transportable Tablespaces (TTS) -- Common Questions and Issues [ID 1166564.1]
    HTH
    Srini
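    As a rough sketch of the transportable tablespace flow (assuming a single tablespace named TS1 and a directory object DUMP_DIR, both placeholder names; check both platforms' endian formats in V$TRANSPORTABLE_PLATFORM first, and RMAN CONVERT the datafiles if they differ):
    -- On the 10.2.0.4 source:
    ALTER TABLESPACE ts1 READ ONLY;
    expdp system/password directory=dump_dir dumpfile=ts1_meta.dmp transport_tablespaces=ts1
    -- Copy the datafiles and the dump file to the target, then on the 11.2.0.3 target:
    impdp system/password directory=dump_dir dumpfile=ts1_meta.dmp transport_datafiles='/u01/oradata/ts1_01.dbf'
    ALTER TABLESPACE ts1 READ WRITE;
    The tablespace arrives under its source name; on 11g it can be renamed afterwards with ALTER TABLESPACE ... RENAME TO if it must match your naming scheme.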

  • Export Oracle database schema without data

    Hi All,
    can any one tell me how to export an Oracle database schema without data, using the exp command, not the datapump command.

    step 1: type exp help=y
    step 2: read the output
    step 3: now run exp ... rows=n
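    For instance, a sketch with placeholder credentials, schema, and file names:
    exp system/password owner=your_schema file=schema_norows.dmp log=schema_norows.log rows=n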
    Life can be so easy when you aren't lazy. You don't waste time asking to be spoon fed by others if you can do something yourself in 2 seconds.
    Sybrand Bakker
    Senior Oracle DBA

  • Get export of cpce schema without data

    exp system/***** file=cpceschema.dmp log=cpceschema.log rows=n full=n owner=cpce
    Is this the right command for getting an export of the cpce schema without data? I took the export and it terminated successfully with warnings; need suggestions.

    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
    With the Partitioning, Real Application Clusters, OLAP and Data Mining options
    Export done in WE8ISO8859P15 character set and AL16UTF16 NCHAR character set
    server uses WE8ISO8859P1 character set (possible charset conversion)
    Note: table data (rows) will not be exported
    About to export specified users ...
    . exporting pre-schema procedural objects and actions
    . exporting foreign function library names for user CPCE
    . exporting PUBLIC type synonyms
    . exporting private type synonyms
    . exporting object type definitions for user CPCE
    About to export CPCE's objects ...
    . exporting database links
    . exporting sequence numbers
    . exporting cluster definitions
    . about to export CPCE's tables via Conventional Path ...
    . . exporting table COLLECTEDDATA
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    . . exporting table CONFIGURATION
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    . . exporting table DCCONTEXT
    . . exporting table DCPLAN
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    . . exporting table EQUIPMENT
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    . . exporting table FDCCONTEXT
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    . . exporting table GPCDATA
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    . . exporting table INDICATOR
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    . . exporting table JOBS_TBL
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    . . exporting table MAINTENANCEERROR
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    . . exporting table MAINTENANCELOG
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    . . exporting table MESCONTEXT
    . . exporting table MODULE
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    . . exporting table PARTITIONMETHOD1
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    . . exporting table PARTITIONMETHOD2
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    . . exporting table RECIPE
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    . . exporting table REQUESTS_TBL
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    . . exporting table STRATEGY
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    . . exporting table TREATEDDATA
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    . . exporting table VARIABLE
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    . exporting synonyms
    . exporting views
    . exporting stored procedures
    . exporting operators
    . exporting referential integrity constraints
    . exporting triggers
    . exporting indextypes
    . exporting bitmap, functional and extensible indexes
    . exporting posttables actions
    . exporting materialized views
    . exporting snapshot logs
    . exporting job queues
    . exporting refresh groups and children
    . exporting dimensions
    . exporting post-schema procedural objects and actions
    . exporting statistics
    Export terminated successfully with warnings.
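    The command itself is right for a no-rows export - the log even confirms "table data (rows) will not be exported". The EXP-00091 warnings concern optimizer statistics being flagged as questionable, here most likely because the client character set (WE8ISO8859P15) differs from the server's (WE8ISO8859P1). Two common ways to avoid them, as a sketch: set NLS_LANG to match the server character set before running exp, or skip statistics entirely:
    exp system/***** file=cpceschema.dmp log=cpceschema.log rows=n full=n owner=cpce statistics=none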

  • Large Schema file registered and using folders to load the data (WebDAV)

    I have a very large schema file that I have registered using binary (Schema file > 30000 lines).
    My data has been in MarkLogic and it is organized in a folder structure providing reuse of pieces of the xml.
    It seemed the most logical way to load the data was to create the folder structure in XDB and move the files via WebDAV. We also have extensive XQueries written.
    My question is how do I query the data back out when it is loaded this way. I have read and experimented with resource_view, but that is not going to get me what I need. I would like to make use of the XQueries that I have.
    Can I do this, and if so, HOW??

    Sure. Let me lay out some more specific information.
    My schema has an overall root level of "Machine Makeup".
    All of these items are defined under it with a lot of element reuse and tons of attribute groups that are used throughout the xml schema. I can do a lot of things but what I cannot do is change the structure of the schema.
    The data is currently in a "folder structure" that more closely resembles the following. I have tried to annotate the number of files; keep in mind all of these are working documents and additional "records" (xml files) can be added.
    Composite - contains 12 folders and each of these folders contain xml documents
    compfolder 1 - compfolder 12
    Most of these folders contain < 200 .xml files (each with id.xml as the file name); however, one of these directories currently contains 1546 files. They all belong... no real way to split them up further.
    At the same level as composite
    About half of these folders at this level contain > 1000 but less than 3000.
    Like
    PartsUse > 1000 but less than 3000
    transfer of parts 8000 and can continue to grow
    people > 3000, and these would be used in the transfer of parts, like "who sold it". Every time someone new is involved, a new one is added, etc.
    There are about 12 folders at this level.
    Now, the way the system works is: when a new person who is not in our list is involved, the users get a "new" empty xml document to fill in, and we assign it an id and place it in the folder.
    So it would something like this
    Composite
    folder1 - 1000 xml file
    folder 2 - 200 xml files
    folder 12 - < 2000
    Locations
    contains < 2000 xml files
    Activities < 3000 xml files
    PartsTransfer 8000 xml files and growing
    materials < 1000 xml files
    setup < 1000 xml files
    people 3000 xml files and growing
    groupUse < 1000
    and so forth.
    All of the folders contain the links I had previously laid out.
    So Activities would have materialLink id="333" peopleLink="666".
    And so on and so forth.
    So, because my file numbers are greater than the optimum about half the time, how would I have 3 separate XMLType tables (I understand mine would be more than 3...) from the one schema file that is intertwined with the others? Can you show me an example? Would I annotate it in the schema at just those levels? This schema file is so huge I would not want to have to annotate all elements with a table. Can I pick and choose to mimic the folder structure?
    THANKS SO MUCH for your help. I really want to show that Oracle is the way. I can implement it with a simpler structure, but not with the size and complexity that I need. I haven't been able to find a lot of examples with the new binary registration.
    Edited by: ferrarapayne on Nov 16, 2011 1:00 PM

  • Export schema with data

    Good Morning All,
    I would like to export a schema with its data to another schema.
    Any help?
    Thanks in advance,
    EB NY

    If you have a few tables, the quick and dirty way is to
    create table new_table as select * from old_schema.old_table;
    Otherwise, depending on what version you are on and what access you have, use exp/Data Pump.
    A lot of good pages exist on using expdp, but the general syntax is:
    1st, make the directories:
    connect / as sysdba
    grant create any directory to org_schema_user;
    create or replace directory dump_directory as '/my_directory_on_the_file_system/';
    grant read, write on directory dump_directory to org_schema_user;
    grant read, write on directory dump_directory to new_schema_user;
    Export:
    expdp user/password schemas=start_schema directory=dump_directory dumpfile=my_dmp_file.dmp logfile=my_dmp_file_export.log
    Import:
    impdp new_user/password remap_schema=old_schema:new_schema directory=dump_directory dumpfile=my_dmp_file.dmp logfile=my_dmp_file_import.log

  • Cmd to export and import table data of a particular schema

    hi,
    can anyone tell me the cmd to export and import table data of a particular schema

    Exp userid=system/pass owner=SCOTT rows=Y triggers=Y
    Imp userid=system/pass file=expdat.dmp fromuser=SCOTT touser=NEW commit=yes log=log.txt
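    (Note: the exp command above omits file=, so it writes to the default dump file expdat.dmp, which is why the imp command can reference file=expdat.dmp.)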

  • How to export message body and table data to Excel from Outlook 2010

    I usually get employee announcements in emails, and I need to compile an Excel sheet from all these emails to track the change in status of each employee from the previous reporting line to the current one.
    Dear Concerned,
    The change in status of the following employee has been carried out as per following details:
    New Status: Change in Job
    Effective Date: 01-Feb-2015
    Employee Name: Ricky Ponting
    Employee Code: 4982
    Designation: Sourcing Executive (Secondment)
    Job Group: 1A
    Department: Sourcing & Supply Chain
    Unit: Technology Sourcing
    Division: Finance
    Location: Sydney
    Reporting Line: Mr Micheal King
    Note: Ricky Ponting was previously working as Tariff Implementation Support Officer in XYZ organization and was reporting to Mr Robin Sing.
    I need working code that exports the above HTML table data as well as the final "Note:" full line, so that I can have an Excel file of the 2000 employees whose status has changed; then I can easily sort out from which previous line they were reporting to which new line, and get in touch with the new line for any access-rights re-authorization exercise at a later stage.
    Currently I am using the following code, which works fine with the table extraction, but the "Note:" line is not being fetched. The code is based on the following URL:
    https://techniclee.wordpress.com/2011/10/29/exporting-outlook-messages-to-excel/
    Const MACRO_NAME = "Export Messages to Excel (Rev Sajjad)"
    Private Sub ExportMessagesToExcel()
        Dim olkFld As Outlook.MAPIFolder, _
            olkMsg As Outlook.MailItem, _
            excApp As Object, _
            excWkb As Object, _
            excWks As Object, _
            arrCel As Variant, _
            varCel As Variant, _
            lngRow As Long, _
            intPtr As Integer, _
            intVer As Integer
        Set olkFld = Session.PickFolder
        If TypeName(olkFld) = "Nothing" Then
            MsgBox "You did not select a folder.  Operation cancelled.", vbCritical + vbOKOnly, MACRO_NAME
        Else
            intVer = GetOutlookVersion()
            Set excApp = CreateObject("Excel.Application")
            Set excWkb = excApp.Workbooks.Add
            Set excWks = excWkb.Worksheets(1)
            excApp.Visible = True
            With excWks
                .Cells(1, 1) = "Subject"
                .Cells(1, 2) = "Received"
                .Cells(1, 3) = "Sender"
                .Cells(1, 4) = "New Status"
                .Cells(1, 5) = "Effective Date"
                .Cells(1, 6) = "Employee Name"
                .Cells(1, 7) = "Employee Code"
                .Cells(1, 8) = "Designation"
                .Cells(1, 9) = "Job Group"
                .Cells(1, 10) = "Department"
                .Cells(1, 11) = "Unit"
                .Cells(1, 12) = "Division"
                .Cells(1, 13) = "Location"
                .Cells(1, 14) = "Reporting Line"
                .Cells(1, 15) = "Note:"
            End With
            lngRow = 2
            For Each olkMsg In olkFld.Items
                excWks.Cells(lngRow, 1) = olkMsg.Subject
                excWks.Cells(lngRow, 2) = olkMsg.ReceivedTime
                excWks.Cells(lngRow, 3) = GetSMTPAddress(olkMsg, intVer)
               arrCel = Split(GetCells(olkMsg.HTMLBody), Chr(255)) ' parse the message's HTML table into cells; GetCells is defined in the article linked above
               For intPtr = LBound(arrCel) To UBound(arrCel)
                    Select Case Trim(arrCel(intPtr))
                        Case "New Status"
                            excWks.Cells(lngRow, 4) = arrCel(intPtr + 1)
                        Case "Effective Date"
                            excWks.Cells(lngRow, 5) = arrCel(intPtr + 1)
                        Case "Employee Name"
                            excWks.Cells(lngRow, 6) = arrCel(intPtr + 1)
                        Case "Employee Code"
                            excWks.Cells(lngRow, 7) = arrCel(intPtr + 1)
                        Case "Designation"
                            excWks.Cells(lngRow, 8) = arrCel(intPtr + 1)
                        Case "Job Group"
                            excWks.Cells(lngRow, 9) = arrCel(intPtr + 1)
                        Case "Department"
                            excWks.Cells(lngRow, 10) = arrCel(intPtr + 1)
                        Case "Unit"
                            excWks.Cells(lngRow, 11) = arrCel(intPtr + 1)
                        Case "Division"
                            excWks.Cells(lngRow, 12) = arrCel(intPtr + 1)
                        Case "Location"
                            excWks.Cells(lngRow, 13) = arrCel(intPtr + 1)
                        Case "Reporting Line"
                            excWks.Cells(lngRow, 14) = arrCel(intPtr + 1)
                        Case "Note:"
                            excWks.Cells(lngRow, 15) = arrCel(intPtr + 1) ' column 15 matches the "Note:" header row
                        End Select
                Next
                lngRow = lngRow + 1
            Next
            excWks.Columns("A:W").AutoFit
            excApp.Visible = True
            Set excWks = Nothing
            Set excWkb = Nothing
            Set excApp = Nothing
        End If
        Set olkFld = Nothing
    End Sub
    Private Function GetSMTPAddress(Item As Outlook.MailItem, intOutlookVersion As Integer) As String
        Dim olkSnd As Outlook.AddressEntry, olkEnt As Object
        On Error Resume Next
        Select Case intOutlookVersion
            Case Is < 14
                If Item.SenderEmailType = "EX" Then
                    GetSMTPAddress = SMTP2007(Item)
                Else
                    GetSMTPAddress = Item.SenderEmailAddress
                End If
            Case Else
                Set olkSnd = Item.Sender
                If olkSnd.AddressEntryUserType = olExchangeUserAddressEntry Then
                    Set olkEnt = olkSnd.GetExchangeUser
                    GetSMTPAddress = olkEnt.PrimarySmtpAddress
                Else
                    GetSMTPAddress = Item.SenderEmailAddress
                End If
        End Select
        On Error GoTo 0
        Set olkPrp = Nothing
        Set olkSnd = Nothing
        Set olkEnt = Nothing
    End Function
    Function GetOutlookVersion() As Integer
        Dim arrVer As Variant
        arrVer = Split(Outlook.Version, ".")
        GetOutlookVersion = arrVer(0)
    End Function
    Function SMTP2007(olkMsg As Outlook.MailItem) As String
        Dim olkPA As Outlook.PropertyAccessor
        On Error Resume Next
        Set olkPA = olkMsg.PropertyAccessor
        SMTP2007 = olkPA.GetProperty("http://schemas.microsoft.com/mapi/proptag/0x5D01001E")
        On Error GoTo 0
        Set olkPA = Nothing
    End Function
    Sub DebugLabels()
        Dim olkMsg As Outlook.MailItem, objFSO As Object, objFil As Object, strBuf As String, strPth As String, arrCel As Variant, intPtr As Integer
        strPth = Environ("USERPROFILE") & "\Documents\Debugging.txt"
        Set olkMsg = Application.ActiveExplorer.Selection(1)
        arrCel = Split(GetCells(olkMsg.HTMLBody), Chr(255))
        For intPtr = LBound(arrCel) To UBound(arrCel)
            strBuf = strBuf & StrZero(intPtr, 2) & vbTab & "*" & arrCel(intPtr) & "*" & vbCrLf
        Next
        Set objFSO = CreateObject("Scripting.FileSystemObject")
        Set objFil = objFSO.CreateTextFile(strPth)
        objFil.Write strBuf
        objFil.Close
        Set olkMsg = Application.CreateItem(olMailItem)
        With olkMsg
            .Recipients.Add "[email protected]"
            .Subject = "Debugging Info"
            .BodyFormat = olFormatPlain
            .Body = "The debugging info for the selected message is attached.  Please click Send to send this message to David."
            .Attachments.Add strPth
            .Display
        End With
        Set olkMsg = Nothing
        Set objFSO = Nothing
        Set objFil = Nothing
    End Sub
    Function StrZero(varNumber, intLength)
        Dim intItemLength
        If IsNumeric(varNumber) Then
            intItemLength = Len(CStr(Int(varNumber)))
            If intItemLength < intLength Then
                StrZero = String(intLength - intItemLength, "0") & varNumber
            Else
                StrZero = varNumber
            End If
        Else
            StrZero = varNumber
        End If
    End Function

    Dear Graham,
    I am already a big fan of yours and have been using the Mail to Many add-in for years, from Word 2007 to Word 2010 :) and still loving it; I use it for access re-authorization from lines for application accesses. I tried and finally got an understanding of the Extract to Mail add-in, and after tweaking Excel - Text to Columns and a few other things - I was finally able to get the required data - from morning to now :) I am happy to see your provided guidance.
    Thanks a lot - by the way, why is your Mail to Many add-in so slow these days :) previous versions usually helped me send 1000 emails in 10 minutes; now it takes a long time :)

  • Dealing with large volumes of data

    Background:
    I recently "inherited" support for our company's "data mining" group, which amounts to a number of semi-technical people who have received introductory level training in writing SQL queries and been turned loose with SQL Server Management
    Studio to develop and run queries to "mine" several databases that have been created for their use.  The database design (if you can call it that) is absolutely horrible.  All of the data, which we receive at defined intervals from our
    clients, is typically dumped into a single table consisting of 200+ varchar(x) fields.  There are no indexes or primary keys on the tables in these databases, and the tables in each database contain several hundred million rows (for example one table
    contains 650 million rows of data and takes up a little over 1 TB of disk space, and we receive weekly feeds from our client which adds another 300,000 rows of data).
    Needless to say, query performance is terrible, since every query ends up being a table scan of 650 million rows of data.  I have been asked to "fix" the problems.
    My experience is primarily in applications development.  I know enough about SQL Server to perform some basic performance tuning and write reasonably efficient queries; however, I'm not accustomed to having to completely overhaul such a poor design
    with such a large volume of data.  We have already tried to add an identity column and set it up as a primary key, but the server ran out of disk space while trying to implement the change.
    I'm looking for any recommendations on how best to implement changes to the table(s) housing such a large volume of data.  In the short term, I'm going to need to be able to perform a certain amount of data analysis so I can determine the proper data
    types for fields (and whether any existing data would cause a problem when trying to convert the data to the new data type), so I'll need to know what can be done to make it possible to perform such analysis without the process consuming entire days to analyze
    the data in one or two fields.
    I'm looking for reference materials / information on how to deal with these issues, particularly when a large volume of data is involved. I'm also looking for information on how to load large volumes of data into the database (current processing of a typical data file takes 10-12 hours to load 300,000 records). Any guidance that can be provided is appreciated. If more specific information is needed, I'll be happy to try to answer any questions you might have about my situation.

    I don't think you will find a single magic bullet to solve all the issues. The main point is that there will be no shortcut for major schema and index changes. You will need at least 120% free space to create a clustered index and facilitate major schema changes.
    I suggest an incremental approach to address your biggest pain points. You mention it takes 10-12 hours to load 300,000 rows, which suggests there may be queries involved in the process which require full scans of the 650 million row table. Perhaps some indexes targeted at improving that process are a good first step.
    What SQL Server version and edition are you using? You'll have more options with Enterprise (partitioning, row/page compression).
    Regarding the data types, I would take a best guess at the proper types and run a query with TRY_CONVERT (assuming SQL 2012) to determine counts of rows that conform or not for each column. Then create a new table (using SELECT INTO) that has strongly typed columns for those columns that are not problematic, plus the others that cannot easily be converted, and then drop the old table and rename the new one. You can follow up later to address column data corrections and/or transformations.
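    For instance, a sketch of that TRY_CONVERT check (table and column names are placeholders):
    SELECT COUNT(*) AS total_rows,
           COUNT(TRY_CONVERT(int, SomeColumn)) AS convertible_rows
    FROM dbo.BigTable;
    TRY_CONVERT returns NULL for values that fail to convert, and COUNT skips NULLs, so the difference between the two counts is the number of problem rows for that column.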
    Dan Guzman, SQL Server MVP, http://www.dbdelta.com

  • SOS!! How to export a schema while the SYSAUX tablespace is offline?

    Hello,
    the SYSAUX tablespace for one of our production databases became corrupted and I had to take it offline in order to open the database. There is no way to recover it, since archive logs are missing from the sequence.
    So I need to export a database schema which holds all the production tables/indexes (there are no procedures, functions, packages, or triggers - just tables, constraints, and indexes).
    When I try to export the schema, I get the following error:
    Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Production
    With the Partitioning, OLAP and Data Mining options
    Export done in UTF8 character set and UTF8 NCHAR character set
    . exporting pre-schema procedural objects and actions
    . exporting foreign function library names for user CMS31_ARMENTEL_SITE_N
    . exporting PUBLIC type synonyms
    . exporting private type synonyms
    . exporting object type definitions for user CMS31_ARMENTEL_SITE_N
    About to export CMS31_ARMENTEL_SITE_N's objects ...
    . exporting database links
    . exporting sequence numbers
    . exporting cluster definitions
    EXP-00056: ORACLE error 376 encountered
    ORA-00376: file 3 cannot be read at this time
    ORA-01110: data file 3: '/opt/data1/oradata/UTF8/sysaux01.dbf'
    ORA-06512: at "SYS.DBMS_METADATA", line 1511
    ORA-06512: at "SYS.DBMS_METADATA", line 1548
    ORA-06512: at "SYS.DBMS_METADATA", line 1864
    ORA-06512: at "SYS.DBMS_METADATA", line 3707
    ORA-06512: at "SYS.DBMS_METADATA", line 3689
    ORA-06512: at line 1
    EXP-00000: Export terminated unsuccessfully
    I've tried direct path export and Data Pump (expdp), but with the same unsuccessful results.
    Is there any way to use the export or expdp utility to get the actual data from a specific schema?
    Many thanks

    Hello Harry and many thanks for your reply.
    EXCLUDE is an option of expdp not exp. So I used expdp and get the following error:
    expdp impexp/blabla dumpfile=SITE_N.dmp schemas=SITE_N exclude=CLUSTER:SITE_N
    Export: Release 10.1.0.2.0 - Production on Tuesday, 29 January, 2008 20:56
    Copyright (c) 2003, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Production
    With the Partitioning, OLAP and Data Mining options
    ORA-31626: job does not exist
    ORA-31638: cannot attach to job SYS_EXPORT_SCHEMA_05 for user IMPEXP
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPV$FT_INT", line 377
    ORA-39077: unable to subscribe agent KUPC$A_1_20080129205606 to queue "KUPC$C_1_20080129205606"
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPC$QUE_INT", line 248
    ORA-25448: rule SYS.KUPC$C_1_20080129205606$227 has errors
    ORA-00376: file 3 cannot be read at this time
    ORA-01110: data file 3: '/opt/data1/oradata/UTF8/sysaux01.dbf'
    A different error (since now I use expdp), but for the same reason (SYSAUX is offline).
    Regards
    Argyris
