Write to ERP Tables on HANA DB using SQL script

Hello All,
We are using HANA as the primary database for our ABAP system and are trying to feed data into ABAP tables using SQLScript, but we are running into authorization errors. Please see below for more details.
Scenario:
I get a "not authorized" error when I try to write data to Z* tables using SQLScript in HANA Studio, but I am able to create new tables in the same schema.
As shown above, Query 1: SAPSR1 is the schema that contains the underlying ABAP tables. ZGSA is an existing table, and I am trying to insert new rows into it.
Queries 2 & 3: Creating new tables in SAPSR1 works fine.
Can you please suggest whether this is the right approach, or do I need an RFC to update these tables from some other tool/app?
Thanks in advance,
Naresh

Hi Naresh,
Obi Wan would now probably say: "this is not the functionality you're looking for".
Even though you are working with Z-tables, you really don't want to start messing with those from outside the context of the NetWeaver system.
Instead, you want to leave control over all tables in the NetWeaver schema completely to the SAP<sid> user and NetWeaver.
For your data loading scenario, just write a simple ABAP report with Native SQL, or an AMDP, to do the copying of the data for you.
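The body of an AMDP method is plain SQLScript executed under the SAP<sid> user, so the authorizations stay where they belong. A minimal sketch of such a method body, assuming a hypothetical staging table zgsa_source and made-up ZGSA fields:

-- SQLScript inside an AMDP method (the ABAP class wrapper is omitted);
-- zgsa_source, field1 and field2 are made-up names
INSERT INTO zgsa (mandt, field1, field2)
  SELECT SESSION_CONTEXT('CLIENT'), field1, field2
    FROM zgsa_source;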
Don't spread your code across the landscape and don't loosen access restrictions on your schema.
- Lars

Similar Messages

  • Question about creating new tables using SQL script in WebLogic Server

    Hi,
I am new to WebLogic, and I am following the book Java EE Development with Eclipse (published by Packt) to learn Java EE. I have installed Oracle Enterprise Pack for Eclipse on the PC, and I am able to log into the WebLogic Server Administration Console and set up a data source. However, the next step is to create tables for the database. The book says that the tables can be created by running a SQL script from the SQL command line.
I cannot see any way of inputting SQL script into the WebLogic Server Administration Console. Also, there is no SQL command line in DOS.
Thanks for your help.
    Brian.

Sounds like you are supposed to run the scripts provided by the tutorial to create the tables, right? In that case, you may need to install an Oracle client to connect to your database. The client is automatically installed with the database, so if you have access to the server that hosts the database, you should be able to run SQL*Plus from there.
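For example, from a command prompt on the machine where the client is installed (connection details here are placeholders):

sqlplus scott/tiger@//dbhost:1521/ORCL
SQL> @create_tables.sql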
    As far as I know, there is no way to run a script from the Admin Console.  I could be wrong, however.

  • Set maximum server memory by using sql scripts

    Dear all
How do I set maximum server memory using SQL scripts in SQL Server 2014? Thanks a lot.
    Best regards,
    Wallace

You can use sys.sp_configure to set max server memory.
Here are some recommendations for max server memory based on RAM size:

RAM (GB)   RAM (MB)   Recommended setting (MB)   Command
16         16384      14745                      EXEC sys.sp_configure 'max server memory (MB)', '14745'; RECONFIGURE;
32         32768      29491                      EXEC sys.sp_configure 'max server memory (MB)', '29491'; RECONFIGURE;
64         65536      58982                      EXEC sys.sp_configure 'max server memory (MB)', '58982'; RECONFIGURE;
128        131072     117964                     EXEC sys.sp_configure 'max server memory (MB)', '117964'; RECONFIGURE;
256        262144     235929                     EXEC sys.sp_configure 'max server memory (MB)', '235929'; RECONFIGURE;
512        524288     471859                     EXEC sys.sp_configure 'max server memory (MB)', '471859'; RECONFIGURE;
1024       1048576    943718                     EXEC sys.sp_configure 'max server memory (MB)', '943718'; RECONFIGURE;
2048       2097152    1887436                    EXEC sys.sp_configure 'max server memory (MB)', '1887436'; RECONFIGURE;
4096       4194304    3774873                    EXEC sys.sp_configure 'max server memory (MB)', '3774873'; RECONFIGURE;
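Note that 'max server memory (MB)' is an advanced option, so on a default installation you may need to expose it before setting it; a minimal sketch for a 64 GB server:

EXEC sys.sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sys.sp_configure 'max server memory (MB)', '58982';
RECONFIGURE;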
    Hope this will help

  • Unable to capture the adf table column sort icons using open script tool

    Hi All,
I am new to OATS, and I am trying to create a script for testing an ADF application using the OpenScript tool. I face issues recording two events.
1. I am unable to record the event of clicking the ADF table column sort icons that appear on the column header. I tried to use the capture tool, but that didn't help.
2. The second issue is that I am unable to capture the panel header text. The component can be identified, but I was not able to identify the supporting attribute for the header text.

    Hi keerthi,
1. I have pasted the code for the first issue:
web.button(122,
        "/web:window[@index='0' or @title='Manage Network Targets - Oracle Communications Order and Service Management - Order and Service Management']/web:document[@index='0' or @name='1824fhkchs_6']/web:form[@id='pt1:_UISform1' or @name='pt1:_UISform1' or @index='0']/web:button[@id='pt1:MA:0:n1:1:pt1:qryId1::search' or @value='Search' or @index='3']")
    .click();
adf.table(
        "/web:window[@index='0' or @title='Manage Network Targets - Oracle Communications Order and Service Management - Order and Service Management']/web:document[@index='0' or @name='1c9nk1ryzv_6']/web:ADFTable[@absoluteLocator='pt1:MA:n1:pt1:pnlcltn:resId1']")
    .columnSort("Ascending", "Name");

  • Backingup Database using SQL script

    Hello,
Let me start by saying I am new to Oracle. I am trying to learn by doing a project where I must write a SQL script that can be used to back up all the database files (i.e. control files, redo log files, and data files). Assuming that all of the files are stored in one folder, the source and destination locations of these files must be provided as "substitution variables". The script must implement the following tasks (directions):
    Connect as user SYS with SYSDBA role
    Shutdown the database
    Copy the database files from the source location and store in the destination location
    Restart the database
    Connect as user SCOTT
    In addition the script must use the "HOST" command to issue the operating system command to perform the copy task.
Since this is just a learning project, and not a real scenario, I cannot run my script in SQL*Plus to verify whether it is correct. This is why I am asking for some professional advice. I am currently working with Oracle 11g Enterprise running on a Windows XP client OS.
Below is what I have determined is a cold backup. Please let me know if what I am doing is correct so far, and if not, please steer me in the right direction.
    connect SYS/<password> as SYSDBA
    shutdown;
    HOST copy from &source_file to &destination_file;       --this is the line that is confusing me.
    connect SYS/<password> as SYSDBA
    startup;
    connect scott/tiger
    thank you in advance for your time and input.

Thank you Brian and Frank. I agree with you that RMAN is the best way to perform a backup (from what I have read over the last week or so) in a production scenario, especially if the DB needs 24/7 access. Also, thank you Brian for the helpful links.
I currently have the free Oracle Enterprise 11g edition downloaded to my computer, running in a VM with a Windows XP client OS. But being new at this, I find myself scared of running a bad script that may break my DB. I know I can just reload it and start over, but I am trying to approach this as carefully as possible. The error codes do help me understand what I am doing wrong a lot of the time.
Also, thank you Frank for the SHUTDOWN IMMEDIATE advice. I also like the idea of writing/saving data to a spool file for future reference; that will definitely come in handy down the road. I hadn't thought about writing a .bat file to run via SQL*Plus; that would also be a great alternative.
Continuing with the hypothetical scenario that I need to perform a very basic cold backup that requires a shutdown (with archive logging off) and a restart, do you think this script would run error free? Assume the substitution variables are actual paths to where the database files are stored and that the destination folder actually exists.
    --Windows XP OS
    connect SYS/<password> as SYSDBA
    shutdown IMMEDIATE;
    HOST copy &source_file &destination_file
    connect SYS/<password> as SYSDBA
    startup;
    connect scott/tiger
    Thank you again for your time and patience.
    I find myself learning more from asking questions to professionals on this site than from my own professors.

  • How to upgrade the schema on both the sites using sql scripts?

    Hi Experts,
    I need some help.
I have two sites, SITE A and SITE B; GG is installed and running on both (DML bi-directional replication). I have schema SCHEMA1 on both sites, configured for replication, and replication has been working for the last year.
Now I want to upgrade SCHEMA1 on both sites. There are three .sql scripts containing many SQL commands such as create table, create sequence, create procedure, insert record, etc. I want to run those SQL files on both sites.
So should I stop all the processes on both SITE A and SITE B (including the manager process), execute the SQL scripts on both sites, and then start the processes again? Will that work?
Is there a better way to do this?
Could you please suggest the steps so that I can successfully execute these scripts on both sites to upgrade SCHEMA1.

There is no blanket answer here; it depends. The main questions are:
1. Do the SQL scripts update existing tables?
2. Are you replicating DDL in one direction? (DDL should only be replicated one way even when doing bi-directional replication, so DDL should only be issued on the node capturing DDL.)
3. Are you using a wildcard (*) for table names or an explicit list?
The easiest case is if you're replicating DDL from A to B: apply the DDL to A only. This assumes that the application writing to these tables can handle DDL changes under the covers. If these are new tables supporting new application features, then you would simply enable said features after applying the SQL files.
From there it gets more complicated, and we would need answers to the above questions before going down each line of logic. But try to remember what's really going on here: data in one shape (defined by the DDL) is being captured and sent along. If the shape of that data changes, then the extract and replicat need to update their metadata to handle it correctly. If change data encounters a different shape than what's cached, you will become out of sync.
I'm not sure if that makes sense, but again, the answers to the questions above will indicate where more detailed explanations should be focused. In short, we need more detail about what those scripts do and your current setup.
    Good luck,
    -joe

  • Joining 2 tables in oracle database using SQL

I want to join 2 tables together before executing a statement.
The problem is that one is a table of users, who have userIDs, and the other is a table of events, each owned by a userID (there can be many events for the same userID).
I want to retrieve forename and surname from the users table and the event details from the events table, and put a name against each event rather than a userID. How can I do this?
Note: I want to extract EVERY SINGLE EVENT and get the name of the user it is owned by via the userID.
Here's an example of the rows:
    Table Users
    | UserID | Forename | Surname |
    | Y244850 | Jimmy | Conner |
    | Y256738 | Mikey | Reeves |
    Table Events
    | UserID | Date | Type | Location |
    | Y244850 | 07-Jan-01 | Holiday | Ibiza |
    | Y244850 | 15-Dec-01 | Holiday | Jamaica |
    ------------------------------------------------

Well, just qualify user_id as u.user_id or l.user_id:
SQL> SELECT forename, surname, TO_CHAR(id) id,
            TO_CHAR(start_date) start_date, TO_CHAR(end_date) end_date,
            type, u.user_id, location
     FROM leave_details l, user_details u
     WHERE l.user_id = u.user_id;
When you just say SELECT user_id, the database doesn't know which user_id to use: the one in leave_details or the one in user_details.
(Even though you and I know it's the same.)
So just be specific and choose one.
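For reference, the same join in ANSI syntax, using the Users/Events names from the question (the Date column is quoted because DATE is a reserved word, and the column names are illustrative):

SELECT u.Forename, u.Surname, e."Date", e.Type, e.Location
FROM Events e
JOIN Users u ON u.UserID = e.UserID;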
    Sjur

  • How do I select stuff from table just created in PL/SQL script?

When I execute a select statement, SQL*Plus complains that the table does not exist. But it has just been created! I don't know what is going wrong... thanks in advance...
    Code is as follows:
DECLARE
  column_table VARCHAR(50) := 'column_table';
  r_count      NUMBER(10)  := 0;
BEGIN
  column_table := UPPER(column_table);
  -- I created the table here and commit
  EXECUTE IMMEDIATE 'CREATE TABLE ' || column_table
    || ' (table_name varchar2(100), column_name varchar2(1000))';
  COMMIT;
  IF someCondition THEN
    blahblah
    -- this is where SQL*Plus complains that the table or view does not exist
    SELECT COUNT(1) INTO r_count FROM column_table WHERE table_name = someValue;
    blahblah
  END IF;
END;
/
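The static SELECT fails because the block is parsed before the CREATE TABLE executes, so no table named COLUMN_TABLE exists at parse time. The usual fix is to make the query dynamic as well; a minimal sketch reusing the variables above:

EXECUTE IMMEDIATE
  'SELECT COUNT(1) FROM ' || column_table || ' WHERE table_name = :v'
  INTO r_count
  USING someValue;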

Thanks for the additional detail.
    >
    Better way of doing this?
    >
    I sure wouldn't do it that way.
    My first approach would be to produce tables or views that show the differences. Then let the users examine the data or export the data to excel and send it to them.
This is what I have done in the past, and it has worked well. It assumes that Table1_pre and Table1_post both have primary keys.
    There will be four result tables
    1. Pre_but_not_Post - this table has records (primary keys) in the Pre table that are not in the Post table
    2. Post_but_not_pre - opposite of #1 - records in Post table but not in Pre table
    3. Pre_records_changed - this table has records in Pre that are also in Post table but at least one column value is different
    4. Post_records_changed - this table has the Post table records that match #3
    A. Create a view on Table1_pre that only includes the columns to be compared.
    B. Execute four queries (not tested) to do the above data split - this lets Oracle compare the data in the two tables
    1. Pre_but_not_Post - query the key values in the Pre table that are not in the Post table creating a new table to hold the result
    CREATE TABLE PRE_BUT_NOT_POST AS
    SELECT pre.key1 from Table1_pre pre
    MINUS
    SELECT post.key1 from Table1_post post
    2. Post_but_not_Pre - query the key values in the Post table that are not in the Pre table creating a new table to hold the result
    CREATE TABLE POST_BUT_NOT_PRE AS
    SELECT post.key1 from Table1_post post
    MINUS
    SELECT pre.key1 from Table1_pre pre
    3. Pre_records_changed - query the records from the pre table that are in the post table but are different - create a new table
    CREATE TABLE PRE_RECORDS_CHANGED AS
    SELECT PRE.* FROM Table1_pre pre, Table1_post post WHERE pre.key1 = post.key1
    MINUS
    SELECT POST.* FROM Table1_post post, Table1_pre pre WHERE pre.key1 = post.key1
    4. Post_records_changed - query the records from the post table that are in the pre table but are different - create a new table
-- left as an exercise for you
    The first two tables show you the records that aren't in one of the tables. The second two tables have the records that are different in some column.
    In the past I have merged the second two tables into another table and added a flag (PRE or POST) to indicate which table the record came from.
Then I would export the new table in sorted order - ORDER BY KEY1, FLAG - to delimited format. The file has every record that is different, with the POST record immediately followed by the matching (on key) PRE record.
    I created an Excel template with a formula in every cell of every even numbered row. The formula would compare the cell value in the row to the cell value in the row immediately above - this compares PRE to POST values for each column. If the value was different the formula would turn the cell RED.
    Then I would open the delimited file in an Excel worksheet, copy the entire set of cells to the clipboard and then paste it into a copy of the Excel template.
Voila! Every cell that was different was highlighted in RED and easy to spot.
    Users could examine the data at their leisure to determine what was wrong or needed to be fixed and the developer had the corresponding data in a table where it could be changed and then applied to the POST table as an update.
    The above approach is very straightforward and easy to setup and implement. Even if you don't use it as your final solution it will make it easier to confirm that whatever solution you do adopt is correct.
    I would recommend you try the above on a small number of records for one of your tables as a proof-of-concept. You should be able to easily adapt it for your particular requirements. For example, you may just need to write a report or custom query using the two tables from steps 3 and 4 above.

  • Load XML File into temporary tables using sql loader

    Hi All,
I have an XML file as below. I need to insert the contents into temporary staging tables using SQL*Loader. Please advise how I should do that.
For example, Portfolios should go into a separate table, and all the tags inside it should be populated in the columns of the table.
Family should go into a separate table, and all the tags inside it should be populated in the columns of the table.
Similarly Offer, Products, etc.
<ABSProductCatalog xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <ProductSalesHierachy>
    <Portfolios>
      <Portfolio productCode="P1">
        <Attribute name="CatalogProductName" value="Access" />
        <Attribute name="Status" value="Active" />
      </Portfolio>
      <Portfolio productCode="P2">
        <Attribute name="CatalogProductName" value="Data" />
        <Attribute name="Status" value="Active" />
      </Portfolio>
      <Portfolio productCode="P3">
        <Attribute name="CatalogProductName" value="Voice" />
        <Attribute name="Status" value="Active" />
      </Portfolio>
      <Portfolio productCode="P4">
        <Attribute name="CatalogProductName" value="Wireless" />
        <Attribute name="Status" value="Active" />
      </Portfolio>
    </Portfolios>
    <Families>
      <Family productCode="F1">
        <Attribute name="CatalogProductName" value="Internet Access Services" />
        <Attribute name="Status" value="Active" />
        <ParentHierarchy>
          <Item productCode="P1" modelType="Portfolio" />
        </ParentHierarchy>
      </Family>
      <Family productCode="F2">
        <Attribute name="CatalogProductName" value="Local Access Services" />
        <Attribute name="Status" value="Active" />
        <ParentHierarchy>
          <Item productCode="P2" modelType="Portfolio" />
        </ParentHierarchy>
      </Family>
    </Families>
    <SubFamilies>
      <SubFamily productCode="SF1">
        <Attribute name="CatalogProductName" value="Business Internet service" />
        <Attribute name="Status" value="Active" />
        <ParentHierarchy>
          <Item productCode="F1" modelType="Family" />
        </ParentHierarchy>
      </SubFamily>
    </SubFamilies>
    <ProductRefs>
      <ProductRef productCode="WSP1" modelType="Wireline Sales Product">
        <ActiveFlag>Y</ActiveFlag>
        <ProductHierarchy>
          <SalesHierarchy family="F1" subFamily="SF1" portfolio="P1" primary="Y" />
          <SalesHierarchy family="F2" portfolio="P2" primary="N" />
          <FinancialHierarchy quotaBucket="Voice" strategicProdCategory="Local Voice" />
        </ProductHierarchy>
      </ProductRef>
      <ProductRef productCode="MSP2" modelType="Handset">
        <ActiveFlag>Y</ActiveFlag>
        <ProductHierarchy>
          <SalesHierarchy portfolio="P4" primary="Y" />
        </ProductHierarchy>
      </ProductRef>
    </ProductRefs>
  </ProductSalesHierachy>
  <Offers>
    <Offer productCode="ABN">
      <OfferName>ABN</OfferName>
      <OfferDescription>ABN Description</OfferDescription>
      <Segments>
        <Segment>SCG</Segment>
        <Segment>PCG</Segment>
      </Segments>
      <OfferUpdateDate>2009-11-20</OfferUpdateDate>
      <ActiveFlag>Y</ActiveFlag>
    </Offer>
    <Offer productCode="OneNet">
      <OfferName>OneNet</OfferName>
      <OfferDescription>OneNet Description</OfferDescription>
      <Segments>
        <Segment>SCG</Segment>
        <Segment>PCG</Segment>
        <Segment>PCG2</Segment>
      </Segments>
      <OfferUpdateDate>2009-11-20</OfferUpdateDate>
      <ActiveFlag>Y</ActiveFlag>
    </Offer>
  </Offers>
  <Products>
    <Product productCode="WSP1" modelType="Wireline Sales Product">
      <ProductName>AT&amp;T High Speed Internet</ProductName>
      <ProductDescription>High Speed Internet</ProductDescription>
      <LegacyCoProdIndicator>SBC</LegacyCoProdIndicator>
      <RevenueCBLCode>1234B</RevenueCBLCode>
      <VolumeCBLCode>4567A</VolumeCBLCode>
      <SAARTServiceIDCode>S1234</SAARTServiceIDCode>
      <MarginPercentRequired>Y</MarginPercentRequired>
      <PercentIntl>%234</PercentIntl>
      <UOM>Each</UOM>
      <PriceType>OneTime</PriceType>
      <ProductStatus>Active</ProductStatus>
      <Compensable>Y</Compensable>
      <Jurisdiction>Everywhere</Jurisdiction>
      <ActiveFlag>Y</ActiveFlag>
      <Availabilities>
        <Availability>SE</Availability>
        <Availability>E</Availability>
      </Availabilities>
      <Segments>
        <Segment>SCG</Segment>
        <Segment>PCG</Segment>
      </Segments>
      <VDIndicator>Voice</VDIndicator>
      <PSOCCode>PSOC 1</PSOCCode>
      <USBilled>Y</USBilled>
      <MOWBilled>N</MOWBilled>
      <ProductStartDate>2009-11-20</ProductStartDate>
      <ProductUpdateDate>2009-11-20</ProductUpdateDate>
      <ProductEndDate>2010-11-20</ProductEndDate>
      <AliasNames>
        <AliasName>AT&amp;T HSI</AliasName>
        <AliasName>AT&amp;T Fast Internet</AliasName>
      </AliasNames>
      <OfferTypes>
        <OfferType productCode="ABN" endDate="2009-11-20" />
        <OfferType productCode="OneNet" />
      </OfferTypes>
      <DynamicAttributes>
        <DynamicAttribute dataType="String" defaultValue="2.5 Mbps" name="Speed">
          <AttrValue>1.5 Mbps</AttrValue>
          <AttrValue>2.5 Mbps</AttrValue>
          <AttrValue>3.5 Mbps</AttrValue>
        </DynamicAttribute>
        <DynamicAttribute dataType="String" name="TransportType">
          <AttrValue>T1</AttrValue>
        </DynamicAttribute>
      </DynamicAttributes>
    </Product>
    <Product productCode="MSP2" modelType="Handset">
      <ProductName>Blackberry Bold</ProductName>
      <ProductDescription>Blackberry Bold Phone</ProductDescription>
      <LegacyCoProdIndicator />
      <RevenueCBLCode />
      <VolumeCBLCode />
      <SAARTServiceIDCode />
      <MarginPercentRequired />
      <PercentIntl />
      <UOM>Each</UOM>
      <PriceType />
      <ProductStatus>Active</ProductStatus>
      <Compensable />
      <Jurisdiction />
      <ActiveFlag>Y</ActiveFlag>
      <Availabilities>
        <Availability />
      </Availabilities>
      <Segments>
        <Segment>SCG</Segment>
        <Segment>PCG</Segment>
      </Segments>
      <VDIndicator>Voice</VDIndicator>
      <PSOCCode />
      <USBilled />
      <MOWBilled />
      <ProductStartDate>2009-11-20</ProductStartDate>
      <ProductUpdateDate>2009-11-20</ProductUpdateDate>
      <AliasNames>
        <AliasName />
      </AliasNames>
      <OfferTypes>
        <OfferType productCode="ABN" />
      </OfferTypes>
      <DynamicAttributes>
        <DynamicAttribute dataType="String" name="StlmntContractType">
          <AttrValue />
        </DynamicAttribute>
        <DynamicAttribute dataType="String" name="BMG 2 year price">
          <AttrValue>20</AttrValue>
        </DynamicAttribute>
        <DynamicAttribute dataType="String" name="MSRP">
          <AttrValue>40</AttrValue>
        </DynamicAttribute>
        <DynamicAttribute dataType="String" name="BMGAvailableType">
          <AttrValue />
        </DynamicAttribute>
        <DynamicAttribute dataType="String" name="ProductId">
          <AttrValue>123456</AttrValue>
        </DynamicAttribute>
        <DynamicAttribute dataType="String" name="modelSource">
          <AttrValue>product</AttrValue>
        </DynamicAttribute>
      </DynamicAttributes>
    </Product>
  </Products>
  <CatalogChanged>Y</CatalogChanged>
</ABSProductCatalog>

Two options come to mind. Others exist.
#1 - {thread:id=474031}, which is basically storing the XML in an object-relational structure for parsing.
#2 - Dump the XML into either an XMLType-based table or column and use SQL (with XMLTable) to create a view that parses the data. This would be the same as the view shown in the above post.
Don't use SQL*Loader to parse the XML. I was trying to find a post from mdrake about that but couldn't. In short, SQL*Loader was not built as an XML parser, so don't try to use it that way.
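A minimal sketch of option #2, with made-up table and view names:

CREATE TABLE product_catalog_stg (doc XMLTYPE);

-- after inserting the document into product_catalog_stg:
CREATE OR REPLACE VIEW portfolio_v AS
SELECT x.product_code, x.product_name, x.status
FROM product_catalog_stg s,
     XMLTABLE('/ABSProductCatalog/ProductSalesHierachy/Portfolios/Portfolio'
              PASSING s.doc
              COLUMNS
                product_code VARCHAR2(10)  PATH '@productCode',
                product_name VARCHAR2(100) PATH 'Attribute[@name="CatalogProductName"]/@value',
                status       VARCHAR2(20)  PATH 'Attribute[@name="Status"]/@value') x;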

  • Create temporary Tables using SQL

    Hello,
I'm wondering if SAP allows the creation of new tables without SDK objects.
I want to create temporary tables using SQL scripts: create them when a specific add-on connects and drop them when the add-on disconnects.
Do you think this is allowed?
    thanks,
    Gabriela

You could always have a second DB to create your temp tables in. This is the way I've done it, and I've also created my own views and stored procedures there. No updating of the primary database is necessary. The way I named things was:
    Company_DB - Company Database
    Company_DB-Extern - My own stuff
Then, in SAP, you can just do [Company_DB-Extern]..Object to call it, or do the same from within your project.
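A minimal T-SQL sketch of that layout (database and table names are just examples):

CREATE DATABASE [Company_DB-Extern];
GO
USE [Company_DB-Extern];
CREATE TABLE MyTempResults (DocEntry INT, Amount NUMERIC(19, 6));
GO
-- referenced from the company database context:
SELECT * FROM [Company_DB-Extern]..MyTempResults;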

  • Sqlldr flat file to external table using shell scripts

    Hi,
    Has anyone done this before? Please give me a hand.
    Thanks!

    Thanks Justin.
When do I need to create the external table EMP_STAGING?
    These are my steps so far:
- shell script to create the flat file (but I need to change the table name to EMP_STAGING)
    - use a script to call sqlldr to load the flat file into the external table
    - then the script will call the MERGE sql script to merge the data from the external table into the database table
    Am I on the right track?
    In which stage should I create and drop the external table?
    Thanks!
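For reference, an external table is defined over the flat file itself, so once it exists no sqlldr load step is needed; a minimal sketch (directory, file, and column names are placeholders):

CREATE DIRECTORY emp_dir AS '/data/feeds';

CREATE TABLE emp_staging (
  empno NUMBER,
  ename VARCHAR2(30)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY emp_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('emp.dat')
);

-- the MERGE can then read emp_staging like any other table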

  • Populating table using SQL*Plus

    Hi,
I am trying to populate my destination table from the source using SQL*Plus, but I keep getting an error message. The source table has 6 columns and the destination table has 11 columns. The data types are different in the two tables. The following is my script and the output of the script:
    insert into bim_expense_element
    (select table_type_id, table_code_desc,
    table_value_4, table_value_6,
table_description_1 || table_description_2
    from edim_expense_element)
    SQL> /
    insert into bim_expense_element
    ERROR at line 1:
    ORA-00947: not enough values
    Please let me know what I did wrong. Thanks.
    Esther

    Hi Esther,
You need to tell Oracle which columns to fill in your insert, like this:
    insert into bim_expense_element(bim_type_id, bim_code_desc, bim_table_value4, bim_table_value6, bim_description)
    select table_type_id, table_code_desc,
    table_value_4, table_value_6,
table_description_1 || table_description_2
    from edim_expense_element
If the fields aren't of the same type, you'll need to convert them.

  • ORA-01841 Error when value for date col is NULL in .dat (using SQL Loader)

    Hello Gurus,
I have some data in a .dat file which needs to be loaded into an Oracle table. I am using SQL*Loader to do the job. Although "NULLIF col_name=BLANKS" works for the character datatype, when the value for the date column is NULL I get an ORA-01841 error. I need the date column loaded as NULL for all rows without a value for it.
An early reply will be highly appreciated.
    Farooq

    Hi,
Maybe the problem is not with the NULLIF; the value for the date column may not be in a proper date format.
Create table:
create table kk (empno number, ename varchar2(20), deptno number, hiredate date);
Control file:
LOAD DATA
INFILE 'd:\kk\empdata.dat'
INSERT INTO TABLE kk (
  empno    POSITION(1:2)   INTEGER EXTERNAL,
  ename    POSITION(4:5)   CHAR NULLIF ename=BLANKS,
  deptno   POSITION(7:8)   INTEGER EXTERNAL NULLIF deptno=BLANKS,
  hiredate POSITION(10:20) DATE NULLIF hiredate=BLANKS
)
    data file:
    10 KK 01-jan-2005
    20 10
SELECT * FROM kk;

EMPNO  ENAME  DEPTNO  HIREDATE
10     KK             01-JAN-05
20            10
    Verify the data file.
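If the dates in the .dat file are in a non-default format, also spell out the mask on that field, for example (the format string is illustrative):

hiredate POSITION(10:20) DATE "DD-MON-YYYY" NULLIF hiredate=BLANKS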
    Hope it will help

  • Problem while executing script in Toad - How to use '&' in the sql script ?

I have to execute a SQL script in Toad. The SQL script has one insert query in which one of the inserted values is 'USA & CAN'. When I executed the script in Toad by pressing F5, I got a prompt window asking for the value of 'CAN', since it comes after the &.
I tried using {escape '\'} ... but that did not resolve the problem.
Is there any solution or workaround for this problem? I have thousands of records with such values, and I have to use a SQL script only.

There is an option in Toad to change this behaviour.
Look in View > Options > SQL Editor.
Uncheck the box for "Scan statements for bound variables before execution".
In SQL*Plus the equivalent is SET DEFINE OFF (the older, desupported form is SET SCAN OFF).
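For example, in SQL*Plus (the table name is just an example):

SET DEFINE OFF
INSERT INTO regions (name) VALUES ('USA & CAN');
SET DEFINE ON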

  • No Message: Write to Fact table.

    Hi ALL,
Source: ECC 6
Target: BI 7.3
We are transferring the 2LIS_13_VDITM DataSource to the 0SD_CO3 InfoCube.
After data replication:
1. Data is transferred to the PSA.
2. During transformation creation, manual mapping is performed and activated.
3. During DTP creation, only the following warning messages occur, and the status does not turn green.
Data is not reaching the cube, and there are no error messages. (In total, 29000 records should be transferred to the BI cube.)
The warning messages are:
1. No Message: Write to Fact table.
2. No Message: InfoCube Update Completed.
What is the problem?

    Hi,
Have you set the industry sector before filling the setup tables?
For more information, refer to Note 353042.
    Summary
    Symptom
    Fields BWGEO, BWGEOO, BWGVP, BWGVO, BWNETWR, BWMNG, etc. of DataSources 2LIS_02_SCL, 2LIS_02_ITM, 2LIS_03_BF, 2LIS_03_UM, 2LIS_40_REVAL are not filled.
    This may lead to the following:
    The system does not perform any update into an InfoCube (for example: 0RT_C*, 0PUR_C01, 0CP_PURC1 and so on), even though data arrives in BW.
    This occurs with the following InfoSources:
    2LIS_02_SCL, 2LIS_02_ITM
    2LIS_03_BF, 2LIS_03_UM
    2LIS_40_REVAL
With some restrictions, this symptom also occurs with the following InfoSources if they are used in connection with retail or consumer products (InfoCubes: 0RT_* or 0CP_*).
    2LIS_11_VAITM, 2LIS_12_VCITM, 2LIS_13_VDITM
    Other terms
    0PROCESSKEY, PROCESSKEY, 0RT_C01, 0RT_C02, 0RT_C03, 0RT_C04, BWBRTWR, BWGEO, BWGEOO, BWGVP, BWGVO, BWNETWR, BWMNG
    Reason and Prerequisites
    The process key (0PROCESSKEY and 0BWAPPLNM) of the InfoSources has not been filled. As a result, no key figures are updated because of the update routine of the participating InfoCube and along with it no records are inserted into the InfoCube. In each update routine, the system checks the content of the PROCESSKEY. If this field has no contents, then no data is written into the InfoCube because of the IF condition in the update rules.
    Solution
So that you can work with the above-mentioned InfoSources, you MUST activate the determination of the process key. This is done with the help of transaction MCB_, which you can find in the OLTP IMG for BW (transaction SBIW) in your attached R/3 source system.
    Here you can choose your industry sector. 'Standard' and 'Consumer products' are for R/3 standard customers, whereas 'Retail' is intended for customers with R/3 Retail only.
    You can display the characteristics of the process key (R/3 field BWVORG, BW field 0PROCESSKEY) by using Transaction MCB0.
If you have already set up historical data (for example, for testing purposes) by using the setup transactions (statistical setup programs) (for example, purchasing: Tx OLI3BW; material movements: Tx OLI1BW) into the provided setup tables (for example: MC02M_0SCLSETUP, MC03BF0SETUP), you unfortunately have to delete this data (Tx LBWG). After you have chosen the industry sector by using MCB_, perform the setup again, so that the system fills a valid transaction key for each data record generated. Then load this data into your connected BW by using 'Full update' or 'Initialization of the delta process'. Check whether the system now updates data into the involved InfoCubes.
    If all this is not successful, please see Note 315880, and set the application indicator 'BW' to active using Transaction 'BF11'.
    Related notes:
157317 -> You MUST make sure that this note is relevant for you.
352344 -> Process key + reversals in Inventory Management (consulting note).
    Regards,
    Anil Kumar Sharma .P
