ASCII Dump using SQL

Dear All,
Could anyone help me design an efficient SQL query that is also storage efficient? I run the following query on the table:
SET NEWPAGE 0
SET SPACE 0
SET LINESIZE 319
SET TERMOUT OFF
SET PAGESIZE 0
SET ECHO OFF
SET FEEDBACK OFF
SET HEADING OFF
SET COLSEP ','
SET MARKUP HTML OFF SPOOL OFF
SPOOL FileName
alter session set nls_date_format = 'dd-mm-yyyy hh24:mi:ss';
SELECT *
FROM TABLE1
WHERE deltime > trunc(sysdate)-3
AND deltime < trunc(sysdate)-2;
SPOOL OFF
Table structure is as follows:
Name Null? Type
DESTINATION VARCHAR2(20)
SOURCE VARCHAR2(20)
PRIORITY NUMBER(1)
PROTOCOLID VARCHAR2(20)
DCS VARCHAR2(20)
DELTIME DATE
EXPTIME DATE
MESSAGE VARCHAR2(4000)
STATUS VARCHAR2(30)
PUSHROWID VARCHAR2(50)
ID NOT NULL NUMBER(38)
MSGTYPE VARCHAR2(30)
MSISDN VARCHAR2(20)
CUST VARCHAR2(10)
What is happening: the above query creates a very big file. The reason is that because the MESSAGE column is declared as VARCHAR2(4000), SQL*Plus reserves 4000 character positions for it in the dump file regardless of the actual message size in the table. I want this query optimised so that whatever the message size is, only that much is dumped to the spooled file, and if a column is empty it should place null in that column.
Any help please?

Maybe this helps:
SCOTT@ORCL-30/08/05 >select ltrim('                  test') from dual;
LTRI
test
SCOTT@ORCL-30/08/05 >select lpad('test',20,' ') from dual;
LPAD('TEST',20,' ')
                test
SCOTT@ORCL-30/08/05 >select length('                  test') from dual;
LENGTH('                  TEST')
22
SCOTT@ORCL-30/08/05 >select ltrim('                  test') from dual;
LTRI
test
SCOTT@ORCL-30/08/05 >select length( ltrim('                  test')) from dual;
LENGTH(LTRIM('                  TEST'))
4
and use the COLUMN message command in SQL*Plus to format that column to suit your needs.
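If the goal is a compact comma-separated dump rather than fixed-width columns, another option is to let the query build each line itself, so SQL*Plus writes only the actual data. This is a minimal sketch, assuming the TABLE1 columns above and the spool settings already shown; only a few columns are listed, the rest follow the same pattern, and the 'null' placeholder text is my own choice. SET TRIMSPOOL ON on its own already strips the trailing blanks that pad every spooled line.
SET TRIMSPOOL ON
SPOOL FileName
SELECT destination || ',' ||
       source      || ',' ||
       priority    || ',' ||
       TO_CHAR(deltime, 'dd-mm-yyyy hh24:mi:ss') || ',' ||
       NVL(message, 'null') || ',' ||   -- only the real message text, no 4000-character padding
       NVL(status,  'null')             -- empty columns come out as the word null
  FROM table1
 WHERE deltime > TRUNC(SYSDATE) - 3
   AND deltime < TRUNC(SYSDATE) - 2;
SPOOL OFF
Because the whole row is returned as one concatenated VARCHAR2 expression, no per-column width (and therefore no VARCHAR2(4000) padding) applies.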

Similar Messages

  • Data Import from Oracle Dump to SQL Server(2000/2005)

    Hi Friends,
    I'm using Oracle 10g.
    I've a scenario.
    I have an Oracle dump file. Now I would like to import it into SQL Server (2000/2005).
    Can anyone suggest how to do this?
    Any idea or clue will be highly appreciated.

    I think you will need to import the data into SQL Server from an ASCII flat file using SQL Server DTS, if available. You can use SQL*Plus to extract the data out of Oracle into comma-delimited flat files and then use SQL Server DTS to load them into the SQL Server database.
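    As a rough illustration of the loading side, BULK INSERT is a plain T-SQL alternative to DTS; this is only a sketch, and the staging table and file path are hypothetical names:
    -- T-SQL sketch: load the comma-delimited SQL*Plus extract into a staging table
    BULK INSERT dbo.oracle_extract_stage
    FROM 'C:\extracts\table1.csv'
    WITH (
        FIELDTERMINATOR = ',',   -- matches the delimiter used when spooling from SQL*Plus
        ROWTERMINATOR   = '\n'
    );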

  • Load XML File into temporary tables using sql loader

    Hi All,
    I have an XML file as below. I need to insert the contents into a temporary staging table using SQL*Loader. Please advise how I should do that.
    For example, Portfolios should go into a separate table, and all the tags inside it should be populated into the columns of that table.
    Family should go into a separate table, and all the tags inside it should be populated into the columns of that table.
    Similarly Offers, Products, etc.
    <ABSProductCatalog xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
      <ProductSalesHierachy>
        <Portfolios>
          <Portfolio productCode="P1">
            <Attribute name="CatalogProductName" value="Access" />
            <Attribute name="Status" value="Active" />
          </Portfolio>
          <Portfolio productCode="P2">
            <Attribute name="CatalogProductName" value="Data" />
            <Attribute name="Status" value="Active" />
          </Portfolio>
          <Portfolio productCode="P3">
            <Attribute name="CatalogProductName" value="Voice" />
            <Attribute name="Status" value="Active" />
          </Portfolio>
          <Portfolio productCode="P4">
            <Attribute name="CatalogProductName" value="Wireless" />
            <Attribute name="Status" value="Active" />
          </Portfolio>
        </Portfolios>
        <Families>
          <Family productCode="F1">
            <Attribute name="CatalogProductName" value="Internet Access Services" />
            <Attribute name="Status" value="Active" />
            <ParentHierarchy>
              <Item productCode="P1" modelType="Portfolio" />
            </ParentHierarchy>
          </Family>
          <Family productCode="F2">
            <Attribute name="CatalogProductName" value="Local Access Services" />
            <Attribute name="Status" value="Active" />
            <ParentHierarchy>
              <Item productCode="P2" modelType="Portfolio" />
            </ParentHierarchy>
          </Family>
        </Families>
        <SubFamilies>
          <SubFamily productCode="SF1">
            <Attribute name="CatalogProductName" value="Business Internet service" />
            <Attribute name="Status" value="Active" />
            <ParentHierarchy>
              <Item productCode="F1" modelType="Family" />
            </ParentHierarchy>
          </SubFamily>
        </SubFamilies>
        <ProductRefs>
          <ProductRef productCode="WSP1" modelType="Wireline Sales Product">
            <ActiveFlag>Y</ActiveFlag>
            <ProductHierarchy>
              <SalesHierarchy family="F1" subFamily="SF1" portfolio="P1" primary="Y" />
              <SalesHierarchy family="F2" portfolio="P2" primary="N" />
              <FinancialHierarchy quotaBucket="Voice" strategicProdCategory="Local Voice" />
            </ProductHierarchy>
          </ProductRef>
          <ProductRef productCode="MSP2" modelType="Handset">
            <ActiveFlag>Y</ActiveFlag>
            <ProductHierarchy>
              <SalesHierarchy portfolio="P4" primary="Y" />
            </ProductHierarchy>
          </ProductRef>
        </ProductRefs>
      </ProductSalesHierachy>
      <Offers>
        <Offer productCode="ABN">
          <OfferName>ABN</OfferName>
          <OfferDescription>ABN Description</OfferDescription>
          <Segments>
            <Segment>SCG</Segment>
            <Segment>PCG</Segment>
          </Segments>
          <OfferUpdateDate>2009-11-20</OfferUpdateDate>
          <ActiveFlag>Y</ActiveFlag>
        </Offer>
        <Offer productCode="OneNet">
          <OfferName>OneNet</OfferName>
          <OfferDescription>OneNet Description</OfferDescription>
          <Segments>
            <Segment>SCG</Segment>
            <Segment>PCG</Segment>
            <Segment>PCG2</Segment>
          </Segments>
          <OfferUpdateDate>2009-11-20</OfferUpdateDate>
          <ActiveFlag>Y</ActiveFlag>
        </Offer>
      </Offers>
      <Products>
        <Product productCode="WSP1" modelType="Wireline Sales Product">
          <ProductName>AT&T High Speed Internet</ProductName>
          <ProductDescription>High Speed Internet</ProductDescription>
          <LegacyCoProdIndicator>SBC</LegacyCoProdIndicator>
          <RevenueCBLCode>1234B</RevenueCBLCode>
          <VolumeCBLCode>4567A</VolumeCBLCode>
          <SAARTServiceIDCode>S1234</SAARTServiceIDCode>
          <MarginPercentRequired>Y</MarginPercentRequired>
          <PercentIntl>%234</PercentIntl>
          <UOM>Each</UOM>
          <PriceType>OneTime</PriceType>
          <ProductStatus>Active</ProductStatus>
          <Compensable>Y</Compensable>
          <Jurisdiction>Everywhere</Jurisdiction>
          <ActiveFlag>Y</ActiveFlag>
          <Availabilities>
            <Availability>SE</Availability>
            <Availability>E</Availability>
          </Availabilities>
          <Segments>
            <Segment>SCG</Segment>
            <Segment>PCG</Segment>
          </Segments>
          <VDIndicator>Voice</VDIndicator>
          <PSOCCode>PSOC 1</PSOCCode>
          <USBilled>Y</USBilled>
          <MOWBilled>N</MOWBilled>
          <ProductStartDate>2009-11-20</ProductStartDate>
          <ProductUpdateDate>2009-11-20</ProductUpdateDate>
          <ProductEndDate>2010-11-20</ProductEndDate>
          <AliasNames>
            <AliasName>AT&T HSI</AliasName>
            <AliasName>AT&T Fast Internet</AliasName>
          </AliasNames>
          <OfferTypes>
            <OfferType productCode="ABN" endDate="2009-11-20" />
            <OfferType productCode="OneNet" />
          </OfferTypes>
          <DynamicAttributes>
            <DynamicAttribute dataType="String" defaultValue="2.5 Mbps" name="Speed">
              <AttrValue>1.5 Mbps</AttrValue>
              <AttrValue>2.5 Mbps</AttrValue>
              <AttrValue>3.5 Mbps</AttrValue>
            </DynamicAttribute>
            <DynamicAttribute dataType="String" name="TransportType">
              <AttrValue>T1</AttrValue>
            </DynamicAttribute>
          </DynamicAttributes>
        </Product>
        <Product productCode="MSP2" modelType="Handset">
          <ProductName>Blackberry Bold</ProductName>
          <ProductDescription>Blackberry Bold Phone</ProductDescription>
          <LegacyCoProdIndicator />
          <RevenueCBLCode />
          <VolumeCBLCode />
          <SAARTServiceIDCode />
          <MarginPercentRequired />
          <PercentIntl />
          <UOM>Each</UOM>
          <PriceType />
          <ProductStatus>Active</ProductStatus>
          <Compensable />
          <Jurisdiction />
          <ActiveFlag>Y</ActiveFlag>
          <Availabilities>
            <Availability />
          </Availabilities>
          <Segments>
            <Segment>SCG</Segment>
            <Segment>PCG</Segment>
          </Segments>
          <VDIndicator>Voice</VDIndicator>
          <PSOCCode />
          <USBilled />
          <MOWBilled />
          <ProductStartDate>2009-11-20</ProductStartDate>
          <ProductUpdateDate>2009-11-20</ProductUpdateDate>
          <AliasNames>
            <AliasName />
          </AliasNames>
          <OfferTypes>
            <OfferType productCode="ABN" />
          </OfferTypes>
          <DynamicAttributes>
            <DynamicAttribute dataType="String" name="StlmntContractType">
              <AttrValue />
            </DynamicAttribute>
            <DynamicAttribute dataType="String" name="BMG 2 year price">
              <AttrValue>20</AttrValue>
            </DynamicAttribute>
            <DynamicAttribute dataType="String" name="MSRP">
              <AttrValue>40</AttrValue>
            </DynamicAttribute>
            <DynamicAttribute dataType="String" name="BMGAvailableType">
              <AttrValue />
            </DynamicAttribute>
            <DynamicAttribute dataType="String" name="ProductId">
              <AttrValue>123456</AttrValue>
            </DynamicAttribute>
            <DynamicAttribute dataType="String" name="modelSource">
              <AttrValue>product</AttrValue>
            </DynamicAttribute>
          </DynamicAttributes>
        </Product>
      </Products>
      <CatalogChanged>Y</CatalogChanged>
    </ABSProductCatalog>

    Two options that come to mind. Others exist.
    #1 - {thread:id=474031}, which is basically storing the XML in an Object Relational structure for parsing
    #2 - Dump the XML into either an XMLType based table or column and use SQL (with XMLTable) to create a view that parses the data. This would be the same as the view shown in the above post.
    Don't use SQL*Loader to parse the XML. I was trying to find a post from mdrake about that but couldn't. In short, SQL*Loader was not built as an XML parser, so don't try to use it that way.
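    To make option #2 a bit more concrete, here is a minimal sketch, assuming a release where XMLTable is available (10gR2 or later) and a hypothetical staging table xml_stage with an XMLType column xml_doc holding the document; it shreds the Portfolio elements:
    -- Hypothetical staging table: CREATE TABLE xml_stage (xml_doc XMLTYPE);
    SELECT x.product_code,
           x.catalog_name,
           x.status
    FROM   xml_stage s,
           XMLTABLE('/ABSProductCatalog/ProductSalesHierachy/Portfolios/Portfolio'
                    PASSING s.xml_doc
                    COLUMNS
                      product_code VARCHAR2(10) PATH '@productCode',
                      catalog_name VARCHAR2(80) PATH 'Attribute[@name="CatalogProductName"]/@value',
                      status       VARCHAR2(20) PATH 'Attribute[@name="Status"]/@value') x;
    The same pattern with different XPath expressions covers Families, Offers, Products and so on; wrap each query in a view and use INSERT ... SELECT to fill the staging tables.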

  • Error while taking dump using datapump

    getting following error -
    Export: Release 10.2.0.1.0 - Production on Friday, 15 September, 2006 10:31:41
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    Starting "XX"."SYS_EXPORT_SCHEMA_02": XX/********@XXX directory=dpdump dumpfile=XXX150906.dmp logfile=XXX150906.log
    Estimate in progress using BLOCKS method...
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    ORA-39125: Worker unexpected fatal error in KUPW$WORKER.GET_TABLE_DATA_OBJECTS while calling DBMS_METADATA.FETCH_XML_CLOB []
    ORA-31642: the following SQL statement fails:
    BEGIN "DMSYS"."DBMS_DM_MODEL_EXP".SCHEMA_CALLOUT(:1,0,0,'10.02.00.01.00'); END;
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 86
    ORA-06512: at "SYS.DBMS_METADATA", line 907
    ORA-06550: line 1, column 7:
    PLS-00201: identifier 'DMSYS.DBMS_DM_MODEL_EXP' must be declared
    ORA-06550: line 1, column 7:
    PL/SQL: Statement ignored
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPW$WORKER", line 6235
    ----- PL/SQL Call Stack -----
    object line object
    handle number name
    2A68E610 14916 package body SYS.KUPW$WORKER
    2A68E610 6300 package body SYS.KUPW$WORKER
    2A68E610 9120 package body SYS.KUPW$WORKER
    2A68E610 1880 package body SYS.KUPW$WORKER
    2A68E610 6861 package body SYS.KUPW$WORKER
    2A68E610 1262 package body SYS.KUPW$WORKER
    255541A8 2 anonymous block
    Job "XX"."SYS_EXPORT_SCHEMA_02" stopped due to fatal error at 10:33:12
    The action required is to contact customer support. On Metalink I found a note stating it is a bug in 10g Release 1 that was supposed to be fixed in 10g Release 1 version 4.
    Some of the default schemas were purposely dropped from the database. The only default schemas available now are -
    DBSNMP, DIP, OUTLN, PUBLIC, SCOTT, SYS, SYSMAN, SYSTEM, TSMSYS.
    DIP, OUTLN, TSMSYS were created again.
    Could this be a cause of problem??
    Thanks in adv.

    Hi,
    Below is the DDL taken from a different database. Will this be enough? One more thing, please: what should the password be, should it be DMSYS, since this will not be used by me but by the system?
    CREATE USER "DMSYS" PROFILE "DEFAULT" IDENTIFIED BY "*******" PASSWORD EXPIRE DEFAULT TABLESPACE "SYSAUX" TEMPORARY TABLESPACE "TEMP" QUOTA 204800 K ON "SYSAUX" ACCOUNT LOCK
    GRANT ALTER SESSION TO "DMSYS"
    GRANT ALTER SYSTEM TO "DMSYS"
    GRANT CREATE JOB TO "DMSYS"
    GRANT CREATE LIBRARY TO "DMSYS"
    GRANT CREATE PROCEDURE TO "DMSYS"
    GRANT CREATE PUBLIC SYNONYM TO "DMSYS"
    GRANT CREATE SEQUENCE TO "DMSYS"
    GRANT CREATE SESSION TO "DMSYS"
    GRANT CREATE SYNONYM TO "DMSYS"
    GRANT CREATE TABLE TO "DMSYS"
    GRANT CREATE TRIGGER TO "DMSYS"
    GRANT CREATE TYPE TO "DMSYS"
    GRANT CREATE VIEW TO "DMSYS"
    GRANT DROP PUBLIC SYNONYM TO "DMSYS"
    GRANT QUERY REWRITE TO "DMSYS"
    GRANT SELECT ON "SYS"."DBA_JOBS_RUNNING" TO "DMSYS"
    GRANT SELECT ON "SYS"."DBA_REGISTRY" TO "DMSYS"
    GRANT SELECT ON "SYS"."DBA_SYS_PRIVS" TO "DMSYS"
    GRANT SELECT ON "SYS"."DBA_TAB_PRIVS" TO "DMSYS"
    GRANT SELECT ON "SYS"."DBA_TEMP_FILES" TO "DMSYS"
    GRANT EXECUTE ON "SYS"."DBMS_LOCK" TO "DMSYS"
    GRANT EXECUTE ON "SYS"."DBMS_REGISTRY" TO "DMSYS"
    GRANT EXECUTE ON "SYS"."DBMS_SYSTEM" TO "DMSYS"
    GRANT EXECUTE ON "SYS"."DBMS_SYS_ERROR" TO "DMSYS"
    GRANT DELETE ON "SYS"."EXPDEPACT$" TO "DMSYS"
    GRANT INSERT ON "SYS"."EXPDEPACT$" TO "DMSYS"
    GRANT SELECT ON "SYS"."EXPDEPACT$" TO "DMSYS"
    GRANT UPDATE ON "SYS"."EXPDEPACT$" TO "DMSYS"
    GRANT SELECT ON "SYS"."V_$PARAMETER" TO "DMSYS"
    GRANT SELECT ON "SYS"."V_$SESSION" TO "DMSYS"
    The other database has DMSYS and the status is EXPIRED & LOCKED, but I'm still able to take the dump using Data Pump??

  • Not able to access database from a remote machine using SQL Server Management Studio

    Hi,
    I have a DB_BOX with SQL Server 2008 R2 installed. I can access the databases on the local machine using SQL Server Management Studio, but they are not accessible from other machines, though the machines are in the same domain.
    I have remote connections enabled on the SQL Server box, TCP enabled, and the firewall off. I checked with the IP address too; all SQL Server services are running.
    The SQL Server log shows the message
    The requested service has been stopped or disabled and is unavailable at this time. The connection has been closed.
    I get the below message in SSMS from remote machine.
    Details of error message are
    ===================================
    Cannot connect to DB_BOX.
    ===================================
    A connection was successfully established with the server, but then an error occurred during the login process. (provider: TCP Provider, error: 0 - The specified network name is no longer available.) (.Net SqlClient Data Provider)
    For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&EvtSrc=MSSQLServer&EvtID=64&LinkId=20476
    Server Name: DB_BOX
    Error Number: 64
    Severity: 20
    State: 0
    Program Location:
       at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection)
       at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj)
       at System.Data.SqlClient.TdsParserStateObject.ReadSniError(TdsParserStateObject stateObj, UInt32 error)
       at System.Data.SqlClient.TdsParserStateObject.ReadSni(DbAsyncResult asyncResult, TdsParserStateObject stateObj)
       at System.Data.SqlClient.TdsParserStateObject.ReadNetworkPacket()
       at System.Data.SqlClient.TdsParserStateObject.ReadBuffer()
       at System.Data.SqlClient.TdsParserStateObject.ReadByte()
       at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)
       at System.Data.SqlClient.SqlInternalConnectionTds.CompleteLogin(Boolean enlistOK)
       at System.Data.SqlClient.SqlInternalConnectionTds.AttemptOneLogin(ServerInfo serverInfo, String newPassword, Boolean ignoreSniOpenTimeout, Int64 timerExpire, SqlConnection owningObject)
       at System.Data.SqlClient.SqlInternalConnectionTds.LoginNoFailover(String host, String newPassword, Boolean redirectedUserInstance, SqlConnection owningObject, SqlConnectionString connectionOptions, Int64 timerStart)
       at System.Data.SqlClient.SqlInternalConnectionTds.OpenLoginEnlist(SqlConnection owningObject, SqlConnectionString connectionOptions, String newPassword, Boolean redirectedUserInstance)
       at System.Data.SqlClient.SqlInternalConnectionTds..ctor(DbConnectionPoolIdentity identity, SqlConnectionString connectionOptions, Object providerInfo, String newPassword, SqlConnection owningObject, Boolean redirectedUserInstance)
       at System.Data.SqlClient.SqlConnectionFactory.CreateConnection(DbConnectionOptions options, Object poolGroupProviderInfo, DbConnectionPool pool, DbConnection owningConnection)
       at System.Data.ProviderBase.DbConnectionFactory.CreateNonPooledConnection(DbConnection owningConnection, DbConnectionPoolGroup poolGroup)
       at System.Data.ProviderBase.DbConnectionFactory.GetConnection(DbConnection owningConnection)
       at System.Data.ProviderBase.DbConnectionClosed.OpenConnection(DbConnection outerConnection, DbConnectionFactory connectionFactory)
       at System.Data.SqlClient.SqlConnection.Open()
       at Microsoft.SqlServer.Management.SqlStudio.Explorer.ObjectExplorerService.ValidateConnection(UIConnectionInfo ci, IServerType server)
       at Microsoft.SqlServer.Management.UI.ConnectionDlg.Connector.ConnectionThreadUser()

    Sorry, missed the message from the errorlog in the original post. You shouldn't have included that big .Net dump that hid the important facts. :-)
    My first Google attempt on that message (which I have never seen before) suggests that the TCP endpoint is stopped, so try this:
    ALTER ENDPOINT [TSQL Default TCP]
    STATE=STARTED;
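    To check the endpoint state before and after, the catalog views (SQL Server 2005 and later) can be queried; a quick sketch:
    -- STARTED in state_desc is what you want to see for the default TCP endpoint
    SELECT name, protocol_desc, state_desc
    FROM   sys.endpoints
    WHERE  name = 'TSQL Default TCP';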
    Erland Sommarskog, SQL Server MVP, [email protected]
    This solves the problem. Thanks...

  • How to send PLAIN text email from sp_send_dbmail using SQL Server 2008 R2?

    I have configured Database Mail in SQL Server 2008 R2 (64-bit) and it sends emails just fine. However, the destination is receiving the body of the message as Base64 encoding.
    Snippet:
    EXEC msdb..sp_send_dbmail @profile_name='Outmail', @recipients = @recpts, @subject = @subj, @body_format = 'TEXT', @body=@body2;
    As you can see I set @body_format to TEXT, but it still sends Base 64. I need to send plain ASCII text. How can I change this?

    I am having this exact issue too. I have a trigger that sends an email using a stored procedure. Database Mail is sending the email through the Exchange 2010 server's SMTP. It does not log in, so the email is relayed as user = "Anonymous".
    The SQL dbmail email header is showing the following:
    Content-Type: text/plain; charset="utf-8"
    Content-Transfer-Encoding: base64
    MIME-Version: 1.0
    X-MS-Exchange-Organization-AuthSource: MX01.domain.local
    X-MS-Exchange-Organization-AuthAs: Anonymous
    Here is the header if I send via an Outlook profile using the legacy program I am trying to automate with SQL Server 2008 R2:
    Content-Type: application/ms-tnef; name="winmail.dat"
    Content-Transfer-Encoding: binary
    MIME-Version: 1.0
    X-MS-Has-Attach:
    X-MS-Exchange-Organization-SCL: -1
    X-MS-TNEF-Correlator: <[email protected]>
    MIME-Version: 1.0
    X-MS-Exchange-Organization-AuthSource: MX01.domain.local
    X-MS-Exchange-Organization-AuthAs: Internal
    X-MS-Exchange-Organization-AuthMechanism: 03
    X-Originating-IP: [192.168.13.66]
    Any help would be greatly appreciated!

  • Issue while loading a csv file using sql*loader...

    Hi,
    I am loading a csv file using sql*loader.
    On the number columns, where data is populated in them (decimal numbers/integers), the row errors out with the error
    ORA-01722: invalid number
    I tried checking the value picked from the Excel file,
    and found chr(13), chr(32) and chr(10) characters in the value.
    e.g. select length('0.21') from dual is giving a value of 7 (the pasted literal contains hidden characters).
    When I checked each character with
    select ascii(substr('0.21',5,1)) from dual it returns a value of 9, etc.
    I tried the following expression,
    "to_number(trim(replace(replace(replace(replace(:std_cost_price_scala,chr(9),''),chr(32),''),chr(13),''),chr(10),'')))",
    to remove all the non-numeric special characters, but I am still facing the error.
    Please let me know, any solution for this error.
    Thanks in advance.
    Kiran

    control file:
    OPTIONS (ROWS=1, ERRORS=10000)
    LOAD DATA
    CHARACTERSET WE8ISO8859P1
    INFILE '$Xx_TOP/bin/ITEMS.csv'
    APPEND INTO TABLE XXINF.ITEMS_STAGE
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' TRAILING NULLCOLS
    ItemNum                    "trim(replace(replace(:ItemNum,chr(9),''),chr(13),''))",
    cross_ref_old_item_num               "trim(replace(replace(:cross_ref_old_item_num,chr(9),''),chr(13),''))",
    Mas_description               "trim(replace(replace(:Mas_description,chr(9),''),chr(13),''))",
    Mas_long_description               "trim(replace(replace(:Mas_long_description,chr(9),''),chr(13),''))",
    Org_description               "trim(replace(replace(:Org_description,chr(9),''),chr(13),''))",
    Org_long_description               "trim(replace(replace(:Org_long_description,chr(9),''),chr(13),''))",
    user_item_type                    "trim(replace(replace(:user_item_type,chr(9),''),chr(13),''))",
    organization_code               "trim(replace(replace(:organization_code,chr(9),''),chr(13),''))",
    primary_uom_code               "trim(replace(replace(:primary_uom_code,chr(9),''),chr(13),''))",
    inv_default_item_status          "trim(replace(replace(:inv_default_item_status,chr(9),''),chr(13),''))",
    inventory_item_flag               "trim(replace(replace(:inventory_item_flag,chr(9),''),chr(13),''))",
    stock_enabled_flag               "trim(replace(replace(:stock_enabled_flag,chr(9),''),chr(13),''))",
    mtl_transactions_enabled_flag          "trim(replace(replace(:mtl_transactions_enabled_flag,chr(9),''),chr(13),''))",
    revision_qty_control_code          "trim(replace(replace(:revision_qty_control_code,chr(9),''),chr(13),''))",
    reservable_type               "trim(replace(replace(:reservable_type,chr(9),''),chr(13),''))",
    check_shortages_flag               "trim(replace(replace(:check_shortages_flag,chr(9),''),chr(13),''))",
    shelf_life_code               "trim(replace(replace(replace(replace(:shelf_life_code,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    shelf_life_days               "trim(replace(replace(replace(replace(:shelf_life_days,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    lot_control_code               "trim(replace(replace(:lot_control_code,chr(9),''),chr(13),''))",
    auto_lot_alpha_prefix               "trim(replace(replace(:auto_lot_alpha_prefix,chr(9),''),chr(13),''))",
    start_auto_lot_number               "trim(replace(replace(:start_auto_lot_number,chr(9),''),chr(13),''))",
    negative_measurement_error          "trim(replace(replace(replace(replace(:negative_measurement_error,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    positive_measurement_error          "trim(replace(replace(replace(replace(:positive_measurement_error,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    serial_number_control_code          "trim(replace(replace(:serial_number_control_code,chr(9),''),chr(13),''))",
    auto_serial_alpha_prefix          "trim(replace(replace(:auto_serial_alpha_prefix,chr(9),''),chr(13),''))",
    start_auto_serial_number          "trim(replace(replace(:start_auto_serial_number,chr(9),''),chr(13),''))",
    location_control_code               "trim(replace(replace(:location_control_code,chr(9),''),chr(13),''))",
    restrict_subinventories_code          "trim(replace(replace(:restrict_subinventories_code,chr(9),''),chr(13),''))",
    restrict_locators_code               "trim(replace(replace(:restrict_locators_code,chr(9),''),chr(13),''))",
    bom_enabled_flag               "trim(replace(replace(:bom_enabled_flag,chr(9),''),chr(13),''))",
    costing_enabled_flag               "trim(replace(replace(:costing_enabled_flag,chr(9),''),chr(13),''))",
    inventory_asset_flag               "trim(replace(replace(:inventory_asset_flag,chr(9),''),chr(13),''))",
    default_include_in_rollup_flag          "trim(replace(replace(:default_include_in_rollup_flag,chr(9),''),chr(13),''))",
    cost_of_goods_sold_account          "trim(replace(replace(:cost_of_goods_sold_account,chr(9),''),chr(13),''))",
    std_lot_size                    "trim(replace(replace(replace(replace(:std_lot_size,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    sales_account                    "trim(replace(replace(:sales_account,chr(9),''),chr(13),''))",
    purchasing_item_flag               "trim(replace(replace(:purchasing_item_flag,chr(9),''),chr(13),''))",
    purchasing_enabled_flag          "trim(replace(replace(:purchasing_enabled_flag,chr(9),''),chr(13),''))",
    must_use_approved_vendor_flag          "trim(replace(replace(:must_use_approved_vendor_flag,chr(9),''),chr(13),''))",
    allow_item_desc_update_flag          "trim(replace(replace(:allow_item_desc_update_flag,chr(9),''),chr(13),''))",
    rfq_required_flag               "trim(replace(replace(:rfq_required_flag,chr(9),''),chr(13),''))",
    buyer_name                    "trim(replace(replace(:buyer_name,chr(9),''),chr(13),''))",
    list_price_per_unit               "trim(replace(replace(replace(replace(:list_price_per_unit,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    taxable_flag                    "trim(replace(replace(:taxable_flag,chr(9),''),chr(13),''))",
    purchasing_tax_code               "trim(replace(replace(:purchasing_tax_code,chr(9),''),chr(13),''))",
    receipt_required_flag               "trim(replace(replace(:receipt_required_flag,chr(9),''),chr(13),''))",
    inspection_required_flag          "trim(replace(replace(:inspection_required_flag,chr(9),''),chr(13),''))",
    price_tolerance_percent          "trim(replace(replace(replace(replace(:price_tolerance_percent,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    expense_account               "trim(replace(replace(:expense_account,chr(9),''),chr(13),''))",
    allow_substitute_receipts_flag          "trim(replace(replace(:allow_substitute_receipts_flag,chr(9),''),chr(13),''))",
    allow_unordered_receipts_flag          "trim(replace(replace(:allow_unordered_receipts_flag,chr(9),''),chr(13),''))",
    receiving_routing_code               "trim(replace(replace(:receiving_routing_code,chr(9),''),chr(13),''))",
    inventory_planning_code          "trim(replace(replace(:inventory_planning_code,chr(9),''),chr(13),''))",
    min_minmax_quantity               "trim(replace(replace(replace(replace(:min_minmax_quantity,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    max_minmax_quantity               "trim(replace(replace(replace(replace(:max_minmax_quantity,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    planning_make_buy_code               "trim(replace(replace(:planning_make_buy_code,chr(9),''),chr(13),''))",
    source_type                    "trim(replace(replace(:source_type,chr(9),''),chr(13),''))",
    mrp_safety_stock_code               "trim(replace(replace(:mrp_safety_stock_code,chr(9),''),chr(13),''))",
    material_cost                    "trim(replace(replace(:material_cost,chr(9),''),chr(13),''))",
    mrp_planning_code               "trim(replace(replace(:mrp_planning_code,chr(9),''),chr(13),''))",
    customer_order_enabled_flag          "trim(replace(replace(:customer_order_enabled_flag,chr(9),''),chr(13),''))",
    customer_order_flag               "trim(replace(replace(:customer_order_flag,chr(9),''),chr(13),''))",
    shippable_item_flag               "trim(replace(replace(:shippable_item_flag,chr(9),''),chr(13),''))",
    internal_order_flag               "trim(replace(replace(:internal_order_flag,chr(9),''),chr(13),''))",
    internal_order_enabled_flag          "trim(replace(replace(:internal_order_enabled_flag,chr(9),''),chr(13),''))",
    invoice_enabled_flag               "trim(replace(replace(:invoice_enabled_flag,chr(9),''),chr(13),''))",
    invoiceable_item_flag               "trim(replace(replace(:invoiceable_item_flag,chr(9),''),chr(13),''))",
    cross_ref_ean_code               "trim(replace(replace(:cross_ref_ean_code,chr(9),''),chr(13),''))",
    category_set_intrastat               "trim(replace(replace(:category_set_intrastat,chr(9),''),chr(13),''))",
    CustomCode                    "trim(replace(replace(:CustomCode,chr(9),''),chr(13),''))",
    net_weight                    "trim(replace(replace(replace(replace(:net_weight,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    production_speed               "trim(replace(replace(:production_speed,chr(9),''),chr(13),''))",
    LABEL                         "trim(replace(replace(:LABEL,chr(9),''),chr(13),''))",
    comment1_org_level               "trim(replace(replace(:comment1_org_level,chr(9),''),chr(13),''))",
    comment2_org_level               "trim(replace(replace(:comment2_org_level,chr(9),''),chr(13),''))",
    std_cost_price_scala               "to_number(trim(replace(replace(replace(replace(:std_cost_price_scala,chr(9),''),chr(32),''),chr(13),''),chr(10),'')))",
    supply_type                    "trim(replace(replace(:supply_type,chr(9),''),chr(13),''))",
    subinventory_code               "trim(replace(replace(:subinventory_code,chr(9),''),chr(13),''))",
    preprocessing_lead_time          "trim(replace(replace(replace(replace(:preprocessing_lead_time,chr(9),''),chr(32),''),chr(13),''),chr(10),''))",
    processing_lead_time                "trim(replace(replace(replace(replace(:processing_lead_time,chr(9),''),chr(32),''),chr(13),''),chr(10),''))",
    wip_supply_locator               "trim(replace(replace(:wip_supply_locator,chr(9),''),chr(13),''))"
    Sample data from csv file.
    "9901-0001-35","390000","JMKL16 Pipe bend 16 mm","","JMKL16 Putkikaari 16 mm","","AI","FJE","Ea","","","","","","","","","","","","","","","","","","","","","","","","","21-21100-22200-00000-00000-00-00000-00000","0","21-11100-22110-00000-00000-00-00000-00000","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","0.1","Pull","AFTER PROD","","","Locator for Production"
    The load errors out especially on two columns:
    1) std_cost_price_scala
    2) list_price_per_unit
    Both are number columns.
    When data is provided in them, it errors out; if they hold null values, the records go through fine.
    Message was edited by:
    KK28
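    One way to see exactly which hidden bytes are breaking TO_NUMBER is Oracle's DUMP function; as a sketch, paste the raw value copied from the CSV in place of the literal:
    -- DUMP prints the data type, length and the code of every byte in the value,
    -- so stray chr(9)/chr(10)/chr(13)/chr(160) characters become visible
    SELECT DUMP('0.21') AS raw_bytes FROM dual;
    If a code such as 160 (non-breaking space, common in values pasted from Excel) shows up, it needs its own REPLACE(..., chr(160), '') in the control file expressions.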

  • Using sql loader to load data

    Hi,
    My name is Roshan. I'm pretty new to Oracle.
    I'm trying to load some rows from a comma-delimited ASCII file into a table with 3 VARCHAR2 fields using SQL*Loader. Can anybody give me an example?
    Thanks,
    Roshan.

    Here is some documentation that may help: http://download-west.oracle.com/docs/cd/B10501_01/server.920/a96652/ch10.htm
    Regards,
    OTN
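    As a rough example to get started (the table, column and file names below are made up for illustration), a control file for a three-column VARCHAR2 table loaded from a comma-delimited file could look like this:
    -- demo.ctl : load a comma-delimited ASCII file into a 3-column table
    LOAD DATA
    INFILE 'demo.dat'
    APPEND
    INTO TABLE demo_table
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' TRAILING NULLCOLS
    (
      col1 CHAR,
      col2 CHAR,
      col3 CHAR
    )
    and it would be run with something like: sqlldr userid=scott/tiger control=demo.ctl log=demo.log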

  • Inserting Special Character using SQL*Plus

    I am trying to insert a special character like ® using SQL*Plus, but it is inserting a . (dot) instead.
    Environment:
    Oracle Enterprise version : 9.2.0.3.0
    Sun Solaris 8
    Any help will be appreciated.
    Regards,
    Nirmalya

    That's the reason I always use ASCII (CHR) values for special characters.
    SQL> create table sample1(col1 varchar2(50))
      2  /
    Table created.
    SQL> insert into sample1
      2  select 'The temperature outside is 20'||chr(176)||' centigrade' col1
      3  from dual
      4  /
    1 row created.
    SQL> commit
      2  /
    Commit complete.
    SQL> select * from sample1
      2  /
    COL1
    The temperature outside is 20° centigrade
    1 row selected.
    SQL> drop table sample1 purge
      2  /
    Table dropped.
    SQL>
    Cheers,
    Sarma.
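    For the ® character specifically, the same CHR approach works, and UNISTR is an alternative based on the Unicode code point; a sketch (CHR(174) assumes a Latin-1 based database character set such as WE8ISO8859P1):
    -- CHR uses the database character set; UNISTR takes the Unicode code point
    SELECT 'Registered: ' || CHR(174)        AS via_chr,
           'Registered: ' || UNISTR('\00AE') AS via_unistr
    FROM   dual;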

  • How do I make a connection to SQL Server Express using SQL Developer 1.5.3?

    How do I make a connection to SQL Server Express using SQL Developer 1.5.3, if it's possible at all?
    Also, I received a SQL Server dump file of some sort. Is there any way to read this file in SQL Developer and migrate the db to Oracle? Unfortunately I don't have raw DDL and DML for this database--just some sort of a binary dump file from SQL Server.
    Thanks.
    Dana

    Are you using SQL Developer 1.5.3 or 1.5.4?
    If you are using an older version (i.e. 1.5.3), then you'll have to add a DLL to your environment. This is ntlmauth.dll.
    If you don't find this DLL in your SQL Developer directory, then obtain the JDBC SQL Server plugin (jtds-1.2.2-dist.zip) from:
    http://sourceforge.net/project/showfiles.php?group_id=33291&package_id=25350
    Copy ntlmauth.dll into the ...\\...\ide\lib directory (no need to register it).
    Note: this approach should work if using SQL Developer 1.5.3 and SQL Server Express 2008
    Regards,
    M. R.
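    For reference, the jTDS driver that this plugin provides connects through a JDBC URL of roughly this shape (host, port, database and instance are placeholders; the default SQL Server Express instance name is usually SQLEXPRESS):
    jdbc:jtds:sqlserver://<host>:<port>/<database>;instance=SQLEXPRESS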

  • Dump with SQL Sentences

    Hi
    I'm using SQL statements in a custom program, for example:
            EXEC SQL.
              INSERT INTO ORAGNI.V_SALDOS_FATPO@INT
              VALUES (:TI_SALIDA-PERNR,
                      :TI_SALIDA-NAME1,
                      TO_DATE(:C_FECHA, 'dd.mm.yyyy'),
                      :TI_SALIDA-DISPO,
                      :TI_SALIDA-SALDOP,
                      :TI_SALIDA-SALDOAU,
                      :TI_SALIDA-SALDOAO,
                      :TI_SALIDA-SALDOAPO,
                      :TI_SALIDA-SALDOAE,
                      :TI_SALIDA-SALDOAD,
                      :TI_SALIDA-DARMC,
                      :TI_SALIDA-SALDOT,
    *                :TI_SALIDA-SYSID,
                      :V_FONDO,
                      :TI_SALIDA-ULTIMO_DIV)
            ENDEXEC.
        EXEC SQL.
          COMMIT WORK.
        ENDEXEC.
    But I get a dump when I run this program; this is part of the dump:
    Runtime error DBIF_DSQL2_SQL_ERROR
    occurred on 07.10.2008 at 14:59:31
    SQL error 1722 occurred when executing EXEC SQL.
    -----------------What happened?
    The error occurred in the current database connection "DEFAULT".
    Other dump:
    Runtime error DBIF_DSQL2_SQL_ERROR
    occurred on 07.10.2008 at 14:59:31
    SQL error 1722 occurred when executing EXEC SQL.
    -----------------What happened?
    The error occurred in the current database connection "DEFAULT".
    How can I catch this error? Maybe I can use CATCH ... ENDCATCH. How can I use it?
    Regards
    Gregory

    Hi,
    01722, 00000, "invalid number"
    // *Cause:
    // *Action:
    You must be writing an invalid number into a numeric field. Please check the values in the individual fields using a trace and compare them with the field definitions to find the error.
    Thanks and Best Regards,
    Sunil.

  • Dump using Dynamic WHERE condition (EXISTS + subquery)

    Hi experts,
    I want to use a dynamic WHERE clause here, but I ran into a problem when using EXISTS + subquery.
    Here is my code snippet:
    And the actual SQL sentence should be like this:
    When I execute this program, an exception CX_SY_DYNAMIC_OSQL_SEMANTICS is raised and the program dumps.
    Dump Information:
    Short text
       A dynamically specified column name is unknown.
    Error analysis
        An exception occurred that is explained in detail below.
        The exception, which is assigned to class 'CX_SY_DYNAMIC_OSQL_SEMANTICS', was
         not caught in
        procedure "FRM_GET_ALL_PROD_ORDERS" "(FORM)", nor was it propagated by a
         RAISING clause.
        Since the caller of the procedure could not have anticipated that the
        exception would occur, the current program is terminated.
        The reason for the exception is:
        An Open SQL clause was specified dynamically. The contained field name
        "EXISTS" does not exist in any of the database tables from the FROM clause.
    I'm confused because the plain text of my WHERE condition looks good. Could you tell me why I encountered this problem?
    Many thanks,
    Shelwin

    Hi Shelwin!
    Testing your code I don't get a dump when using this:
    REPORT z_test MESSAGE-ID z_test_msgs.
    DATA: lt_where_tab1 TYPE STANDARD TABLE OF edpline WITH HEADER LINE,
          lt_where_tab2 TYPE STANDARD TABLE OF edpline WITH HEADER LINE,
          gt_detail     TYPE STANDARD TABLE OF caufv WITH HEADER LINE.
    " SEPARATED BY space keeps the SQL fragments from running together in the dynamic clause
    CONCATENATE 'EXISTS (SELECT J_1~STAT FROM JEST AS J_1 INNER JOIN TJ02T AS T_1'
                'ON J_1~STAT EQ T_1~ISTAT'
                'WHERE J_1~OBJNR EQ CAUFV~OBJNR'
                'AND J_1~INACT EQ '' '''
                'AND T_1~SPRAS EQ ''E'''
                'AND T_1~TXT04 EQ ''CRTD'')'
                INTO lt_where_tab1 SEPARATED BY space.
    " T_2 (not T_1) is the alias visible in this second subquery
    CONCATENATE 'NOT EXISTS (SELECT J_2~STAT FROM JEST AS J_2 INNER JOIN TJ02T AS T_2'
                'ON J_2~STAT EQ T_2~ISTAT'
                'WHERE J_2~OBJNR EQ CAUFV~OBJNR'
                'AND J_2~INACT EQ '' '''
                'AND T_2~SPRAS EQ ''E'''
                'AND T_2~TXT04 IN (''TECO'' , ''DLFL'', ''DLV'', ''CLSD''))'
                INTO lt_where_tab2 SEPARATED BY space.
    SELECT aufnr ftrms gltrs
      FROM caufv
      INTO CORRESPONDING FIELDS OF TABLE gt_detail
    WHERE (lt_where_tab1)
       AND (lt_where_tab2).
    BREAK-POINT.
    Regards,
    Edited for error on image.

  • How to load a default value in to a column when using sql loader

    I'm trying to load from a flat file using SQL*Loader.
    For one column I need to load a default value.
    How do I go about this?

    Hi!
    try this code --
    LOAD DATA
       INFILE 'sample.dat'
       REPLACE
       INTO TABLE emp
       (
       empno    POSITION(01:04) INTEGER EXTERNAL NULLIF empno=BLANKS,
       ename    POSITION(06:15) CHAR,
       job      POSITION(17:25) CHAR,
       mgr      POSITION(27:30) INTEGER EXTERNAL NULLIF mgr=BLANKS,
       sal      POSITION(32:39) DECIMAL EXTERNAL NULLIF sal=BLANKS,
       comm     POSITION(41:48) DECIMAL EXTERNAL DEFAULTIF comm='100',
       deptno   POSITION(50:51) INTEGER EXTERNAL NULLIF deptno=BLANKS,
       hiredate SYSDATE          -- filled in by SQL*Loader at load time; no POSITION needed
       )
    Hope this will solve your purpose.
    Regards.
    Satyaki De.
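    For a fixed default value specifically (the original question), the CONSTANT clause is the usual route; a tiny sketch with made-up table and column names:
    -- Columns that do not come from the data file can be given a literal or SYSDATE
    LOAD DATA
    INFILE 'items.dat'
    APPEND INTO TABLE items
    FIELDS TERMINATED BY ','
    (
      item_code CHAR,
      item_desc CHAR,
      status    CONSTANT 'ACTIVE',   -- the same default value is loaded for every row
      load_date SYSDATE              -- filled in by SQL*Loader at load time
    )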

  • Pinning frequently used sqls

    Hi all,
    Please let me know the steps involved in pinning a frequently used SQL statement in the library cache to avoid reparsing.
    Thanks for all your help.

    Frequently used SQL ought to be retained in the library cache without pinning. Oracle only starts flushing stuff from the cache once it's full and then it ages out the least-recently used queries.
    So, if you have a persistent problem with excessive reloading of frequently used queries you may need to reconsider the value of your shared_pool_size.
    Cheers, APC
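    If, after sizing the shared pool properly, you still want to pin a specific cursor, the DBMS_SHARED_POOL package (created by dbmspool.sql if it is not already installed) can do it; a sketch, where the SQL text filter and the address/hash values are made up:
    -- Find the cursor's address and hash value, then keep it in the library cache
    SELECT address, hash_value
    FROM   v$sqlarea
    WHERE  sql_text LIKE 'SELECT /* hot_query */%';
    -- 'C' marks the object being kept as a cursor
    EXEC DBMS_SHARED_POOL.KEEP('00000000ABCD1234,1234567890', 'C');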

  • Connecting to Oracle DB on Ubuntu 8.04 using SQL Developer

    Hi,
    I managed to install Oracle 11g on Ubuntu 8.04 using this guide:
    http://www.pythian.com/blogs/968/installing-oracle-11g-on-ubuntu-804-lts-hardy-heron
    But I don't remember the username or SID; that's why I keep getting ORA-12505 from SQL Developer.
    How can I get this missing information?

    "But I don't remember the Username or SID"
    Assuming your DB is up and running, you can find out the SID by:
    $ ps -ef | grep smon | grep -v grep
    oracle   15298     1  1 10:59 ?        00:00:00 ora_smon_db11
    $
    Here db11 is the SID.
    About the username, you can create one. Use SQL*Plus and OS authentication:
    $ export ORACLE_SID=<your SID>
    $ sqlplus / as sysdba
    SQL> CREATE USER
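    The CREATE USER line above is cut off in the post; a complete version, with hypothetical user name, password and tablespaces, would look something like this:
    -- Pick your own name, password, tablespaces and privileges
    CREATE USER demo_user IDENTIFIED BY demo_pwd
      DEFAULT TABLESPACE users
      TEMPORARY TABLESPACE temp
      QUOTA UNLIMITED ON users;
    GRANT CREATE SESSION, CREATE TABLE TO demo_user;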
