Source table to data source mapping

Hi Guys,
I have a table and a field name in an R/3 source system. Now I want to find out the names of the DataSources that contain this field.
Is there a way to look for it?
For example: you know table RC29L and data element DATUV; now I want to know the names of the DataSources which have this field. Is there a way apart from checking the InfoSources on help.sap.com or checking the individual DataSources of that application component hierarchy (ACH)?
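One way to approach this (a hedged sketch, not from the thread): in the source system, the fields of each DataSource are listed in table ROOSFIELD, which you can query via SE16 or a small report. The column names below are assumptions; verify them in SE11 first.
-- Hedged sketch: list the DataSources containing a given field.
-- OLTPSOURCE, OBJVERS and FIELD are assumed column names of ROOSFIELD.
SELECT oltpsource, field
FROM   roosfield
WHERE  field   = 'DATUV'
AND    objvers = 'A';   -- active versions only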

Hi,
Thanks for response.
But I want to know whether this table holds mappings for all Business Content or only for activated Business Content.
Thanks

Similar Messages

  • What are logical tables, physical tables and logical table sources

    Hi,
    Can anyone explain to me in detail what logical tables, physical tables and logical table sources are?
    Any quick help will be greatly appreciated.

    In OBI there are three layers - Physical, Business Model and Mapping (BMM), and Presentation.
    As the name suggests, the Physical layer mainly contains the physical aspects of the application: which connection to use, which schema to connect to (also which catalog, in the case of SQL Server), and which tables to use. This layer defines the PK-FK joins for the related tables and mainly depicts how the data is stored in the database layer.
    On top of this layer you have the BMM layer, where all the work of a developer starts. You structure the tables according to the business need; the structure has to be a STAR schema. All the entities in this layer are called logical because they do not directly represent any database object; rather, they provide a logical mapping to the database entities. This becomes clear when you use more than one Logical Table Source (LTS) for your logical tables: one logical column can map to N physical columns, based on context. You can also create calculated columns in this layer, which are totally logical in nature.
    I am not writing anything on the Presentation layer as it is not part of your question. :)
    Hope this will help.
    Regards,
    Somnath

  • R/3 tables and data sources

    hi guys,
    I am trying to find the R/3 tables for the Plant Maintenance and Project System DataSources. I have been trying it on help.sap.com, but I am able to find R/3 table information for only about 60% of the DataSources. How can I find the exact R/3 table information for each DataSource? I have gone through the help link for these application components thoroughly.
    Please suggest; I will assign points.

    Hi Selva,
    I typically try this:
    Go to table ROOSOURCE (e.g. via SE16), enter the DataSource name and find out what type of DataSource it is. If it is based on a view/table, then use that table/view. If it is based on a function module, then you need to look at the code and find out from which tables the data is extracted.
    For a function module: go to SE37, enter the function module name, click the "Find" button, enter "select" as the search text and select the radio button "Main program". It will list all the SELECT statements in the function module; in the SELECT statements you can find the names of the tables.
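    For illustration, a hedged SE16-style sketch of that first lookup (EXMETHOD and EXTRACTOR are assumed column names for the extraction method and the underlying view/table/function module; verify them in SE11):
    -- Hedged sketch: find the extractor behind a DataSource.
    SELECT oltpsource, exmethod, extractor
    FROM   roosource
    WHERE  oltpsource = '2LIS_11_VASTH'   -- hypothetical example DataSource
    AND    objvers    = 'A';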
    Hope this helps.
    Bye
    Dinesh

  • 'Select a measure:' stuck on 'Loading...' in Dashboard Designer KPI Dimensional Data Source Mapping

    [using SharePoint 2013 Enterprise SP1]
    I am trying to create a KPI in Dashboard Designer, but am getting a timeout. I have been doing this for a while on my site; this is not the first. I haven't had this problem before.
    I created a new KPI and clicked on the Data Mappings column value, which is a hyperlink, to bring up the Dimensional Data Source Mapping dialog. I switched to a Data Connection in the site I just created (DC works perfectly and can retrieve sample data).
    When I click the "Select a measure:" drop-down menu, I get the message "Loading..." and after a while (a minute? two?) a dialog pops up with:
    The request took too long to complete. SharePoint is currently unavailable or experiencing heavy traffic. Try again later.
    This is a test SP server and I'm the only one on it, so there is no load. Also, as mentioned, I am able to verify the Data Connection without a problem, and I am not having any issue with any of my other few dozen KPIs/Data Connections. Any suggestions as to how to troubleshoot?

    Hi cgtyoder,
    According to your description, my understanding is that you got an error when you created a KPI in Dashboard Designer.
    Please try recycling the PerformancePoint Services application pool account, then compare the result.
    Also try increasing the httpRuntime executionTimeout for the web application by modifying its web.config (under C:\inetpub\wwwroot\wss\VirtualDirectories\<port of the web application>); PerformancePoint report stability is much better afterwards:
    <httpRuntime executionTimeout="600" maxRequestLength="51200" />
    Note: before you change the web.config file, please make a backup of it.
    If this issue still exists, please go to the log file to find more information about this issue.
    I hope this helps.
    Thanks,
    Wendy
    Forum Support

  • Period Source Mapping in FDMEE

    Hello,
    I am using a custom SQL DB as a source for my FDMEE and am stuck on a few queries; I am wondering if anyone can suggest anything.
    1. For period mapping, do I need to have a calendar (period) pre-defined in SQL? If yes, then how can I do that?
    2. I have data columns (Jan-Dec) in the source table, so in the BefImport script how should I insert the values (Jan-Dec) into the AIF amount columns? I can insert the Jan column value into the AMOUNT column of the AIF Open Interface table when selecting "Period Mapping Type" as None.
    Please suggest if anyone has any idea. Much appreciated!
    Thanks

    Hi,
    the open interface adapter table does not have multiple columns for periods, so data is stored in a similar way to FDMEE.
    There are two approaches:
    1) you keep only current-period data in the open interface table.
    2) you keep data for all periods in the open interface table.
    For 1)
    You can make your BefImport script import data from your custom SQL table only for the POV period. You can get the period from fdmContext and use it as a filter in your SQL query to get the data.
    With this option you can use None as the period mapping type, as you don't need source period mappings: you have only the current POV period's data in your open interface table.
    For 2)
    This is typically used when the open interface table is populated from an external source that loads data for all periods into the table. Then you need to extract only your current POV period's data. For that you can use Source Period Mappings.
    With this option you don't need any BefImport script, as the interface table is already populated.
    If you want to insert data for all periods into the open interface table, you need to PIVOT the table, so:
    Acc1;1;2;3;4;5;6;7;8;9;10;11;12 (one row with twelve values, 1 to 12) is converted into:
    Acc1;Jan;1
    Acc1;Feb;2
    Acc1;Mar;3 ...
    Then you can use source period mappings to make the open interface adapter import only the data for the periods you set in the data load rule execution (either the POV period or a range of periods).
    I hope that clarifies.
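    To make the pivot step concrete, here is a hedged SQL sketch (Oracle UNPIVOT syntax; SRC_DATA, ACCOUNT and the month columns are hypothetical names):
    -- Hedged sketch: turn 12 month columns into (account, period, amount) rows.
    SELECT account, period, amount
    FROM   src_data
    UNPIVOT (amount FOR period IN
              (jan AS 'Jan', feb AS 'Feb', mar AS 'Mar', apr AS 'Apr',
               may AS 'May', jun AS 'Jun', jul AS 'Jul', aug AS 'Aug',
               sep AS 'Sep', oct AS 'Oct', nov AS 'Nov',
               "DEC" AS 'Dec'));  -- "DEC" quoted: DEC is a reserved word
    -- For approach 1) you would instead filter the source query by the POV
    -- period taken from fdmContext in the BefImport script.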

  • Info source mapping

    Hi,
    I have activated the 2LIS_11_VASTH InfoSource and found that these fields are not mapped:
    Confirmation Status - BESTK
    Billing Status (order-related billing documents) - FKSAK
    Billing Status - FKSTK
    Overall processing status of documents - GBSTK
    Please help.
    Thanks,

    Hi Kim,
    You can map GBSTK to InfoObject 0BVTSPRSTAT.
    You can check table RSOSFIELDMAPSH for the mapping of fields.
    The other fields don't have InfoObjects mapped to them.
    Check this: 0SPL_CFSTAT for Confirmation Status.
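    For reference, a hedged SE16-style sketch of that lookup (OLTPSOURCE, FIELDNM and IOBJNM are assumed column names of RSOSFIELDMAPSH; confirm them in SE11 first):
    -- Hedged sketch: list shipped field-to-InfoObject mappings for a DataSource.
    SELECT oltpsource, fieldnm, iobjnm
    FROM   rsosfieldmapsh
    WHERE  oltpsource = '2LIS_11_VASTH';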
    Hope this helps.
    regards,
    Diego

  • Mapping of PL/SQL table type DATE to Java

    I am having a problem mapping the PL/SQL table type DATE in Java; I am able to execute procedures which return the PL/SQL table types NUMBER and VARCHAR.
    I am using Oracle 9, JDK 1.4, the OCI driver, and Windows 2000.
    sample code:
    registering:
    st.registerIndexTableOutParameter(15,100,OracleTypes.DATE,1000);
    st.registerIndexTableOutParameter(16,100,OracleTypes.DATE,1000);
    st.execute();
    getting out params in arrays:
    java.sql.Date[] O_lSubFolder_CrOn = (java.sql.Date[]) st.getPlsqlIndexTable(15);
    java.sql.Date[] O_lSubFolder_MdOn = (java.sql.Date[]) st.getPlsqlIndexTable(16);
    error while executing the code:
    java.sql.SQLException: Invalid PL/SQL Index Table element type
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:180)
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:222)
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:285)
    at oracle.jdbc.driver.OraclePreparedStatement.checkPlsqlIndexTableBindTypes(OraclePreparedSt
    atement.java:2705)
    at oracle.jdbc.driver.OracleCallableStatement.registerIndexTableOutParameter(OracleCallableS
    tatement.java:834)
    Can anyone help me to solve this problem?

    1. Write a wrapper procedure that converts the table of dates to either NUMBER or VARCHAR2 and then re-converts the table back into DATE.
    2. Since it's an OUT param, you could create a temp table, insert the contents of the index-by array into it and return a cursor.
    3. Create an Oracle type using CREATE TYPE and then use an array of that type.
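    A minimal PL/SQL sketch of option 1 (hypothetical names throughout; real_proc stands in for the original procedure, and on the Java side you would register the parameter as OracleTypes.VARCHAR and parse the strings back into dates):
    -- Hedged sketch: expose the DATE index-by table as VARCHAR2 so the JDBC
    -- driver can bind it, doing the conversion inside PL/SQL.
    CREATE OR REPLACE PACKAGE date_wrap AS
      TYPE t_dates IS TABLE OF DATE INDEX BY BINARY_INTEGER;
      TYPE t_chars IS TABLE OF VARCHAR2(20) INDEX BY BINARY_INTEGER;
      PROCEDURE get_dates_as_char (o_dates OUT t_chars);
    END date_wrap;
    /
    CREATE OR REPLACE PACKAGE BODY date_wrap AS
      PROCEDURE get_dates_as_char (o_dates OUT t_chars) IS
        l_dates t_dates;
      BEGIN
        real_proc(l_dates);  -- hypothetical original procedure returning dates
        FOR i IN 1 .. l_dates.COUNT LOOP
          o_dates(i) := TO_CHAR(l_dates(i), 'YYYY-MM-DD HH24:MI:SS');
        END LOOP;
      END get_dates_as_char;
    END date_wrap;
    /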
    David Rolfe
    Orinda Software

  • Split Source Mapping Generation Task

    Hi
    I created a Dynamic Web Project in IBM Rational Application Developer Version 8. I am using BEA WebLogic 10.0.2 as the application server. When creating the dynamic web project, I added the project to an EAR and named it testEAR. The project compiles without any errors and the EAR is generated. When I try to add it to my server and deploy, I get the following error.
    Runtime exception occurred in publish task 'Split Source Mapping Generation Task'.
    Path must include project and resource name: /testEAR
    I checked web.xml for duplicate entries and didn't find anything. Below are my XML files for the Deployment Assembly. The first is for the testEAR project:
    <?xml version="1.0" encoding="UTF-8"?><project-modules id="moduleCoreId" project-version="1.5.0">
    <wb-module deploy-name="testEAR">
    <wb-resource deploy-path="/" source-path="/"/>
    <dependent-module archiveName="XYZ.war" deploy-path="/" handle="module:/resource/XYZ/XYZ">
    <dependent-object>WebModule_1308854753228</dependent-object>
    <dependency-type>uses</dependency-type>
    </dependent-module>
    </wb-module>
    </project-modules>
    The second is for the Dynamic Web Project
    <?xml version="1.0" encoding="UTF-8"?><project-modules id="moduleCoreId" project-version="1.5.0">
    <wb-module deploy-name="XYZ">
    <wb-resource deploy-path="/" source-path="/WebContent"/>
    <wb-resource deploy-path="/WEB-INF/classes" source-path="/src"/>
    <property name="context-root" value="XYZ"/>
    <property name="java-output-path" value="/XYZ/WebContent/WEB-INF/classes"/>
    </wb-module>
    </project-modules>
    Please let me know how to resolve this problem.

    Hi,
    This failure probably occurred because a Content directory was not specified for the EAR project.
    If a Content directory is not specified, then the ".settings" and ".project" files will be part of the source path for the BEA compiler. The WLS build scripts parse/compile and copy all the contents of the source folder, so the ".settings" and ".project" directories are also copied into the temporary build directory. This causes the failure during deployment, as ".settings" and ".project" components can't be deployed.
    The Content directory is a requirement only in a WLS environment, due to the build logic of WLS. By default, the WLS Eclipse tasks create a Content directory for an EAR project, so this is not documented; but when you import an existing EAR directory, this limitation is exposed.
    Creating the EAR Content folder is the best solution.
    Steps to modify imported applications:
    1. Create a directory for the EAR content (for APP-INF, META-INF and all other modules or files that are part of the EAR) and copy all the files (except ".settings" and ".project") into the EAR content directory. (This has to be performed in Windows Explorer.)
    2. Then edit the file .settings/org.eclipse.wst.common.component to change the EAR content directory reference for deploy-path. For example, change:
    <wb-resource deploy-path="/" source-path="/"/>
    to
    <wb-resource deploy-path="/" source-path="/EarContent"/>
    3. Refresh the eclipse project.
    4. Delete the temporary project folder that is created in the location below:
    <Workspace Home>\.metadata\.plugins\org.eclipse.core.resources\.projects\
    For example:
    cd D:\Cases\EclipseWorkspace\.metadata\.plugins\org.eclipse.core.resources\.projects\
    del WebTestEAR\
    (This is a temporary folder which will be created by the WebLogic build scripts during deployment.)
    5. Now, try to deploy the EAR project.
    Hope it helps.

  • Import of Main and Lookup table in a single Map

    Hey Guys,
    I am developing a proof of concept to import the Main table and a Lookup [Flat] table in a single Import Map (using a single Excel file).
    Below is my Table structure:
    Main Table: Customer
    --->Customer_Number (Text, Unique Field, Display Field)
    --->Sales_Area (Lookup Flat)
    Lookup Table: Sales_Area
    --->Sales_Area_ID (Text, Unique Field, Display Field)
    --->Sales_Area_Desc (Text, Display Field)
    The import file (Excel) has the attributes below:
    Customer_Number, Sales_Area_ID, Sales_Area_Desc
    When I start, both the Main table and the Lookup table are empty (there is no data in Data Manager for either of the tables).
    In the Import Map, I selected the Excel file as the source and the Main table as the target.
    I did the mapping of Customer_Number as usual; after that I created a compound field for Sales_Area_ID + Sales_Area_Desc and mapped this compound field, then did the mapping for Sales_Area_ID and Sales_Area_Desc.
    Since there is no data in the lookup table, I selected the "Add" button in the "Value Mapping" section. When I execute this map, it works perfectly and data is loaded into both the Main table and the lookup table; but if a new value comes in the Excel file (a value which does not yet exist in the lookup table), the map fails. When I open it, it says that I need to redo the Value Mapping; I click the "Add" button again and it starts working. So basically the Import Map fails whenever I get a value in the Excel file which does not yet exist in the Lookup table.
    Now my question is: is there a way to automate my import map? I thought clicking the "Add" button would take care of all the lookup values which are not already present.
    Can anyone please help me in this regard?
    Thanks
    Saif

    Hi Saif,
    You can try the following option.
    Right-click on the lookup field/compound field in the destination fields, and select the option 'Set MDIS Unmapped Value Handling' as 'Add'.
    Cheers,
    Cherry.

  • Auto-creating tables with CMP Explicit Mapping

    Hello.
    I am trying to migrate a working CMP EJB JAR from WebLogic to OC4J, using the admin.jar deploy utility. I believe my orion-ejb-jar.xml deployment descriptor (shown at the bottom) is set up properly, but OC4J attempts to auto-create tables for my entity beans anyway (see the messages that follow).
    Can anybody tell me why it tries to create these tables? Is something missing from my deployment descriptor?
    Thanks,
    David
    Auto-creating table: create table User_passwords_Passwor__fqeml6 (User_SEC_USER.SEC_USER_ID NUMBER not null, Password_SEC_USER_PASSWORD_ID NUMBER null)
    Error creating table: ORA-01748: only simple column names allowed here
    Auto-creating table: create table User_userRoles_UserRol__guatbv (User_SEC_USER.SEC_USER_ID NUMBER not null, UserRole_SEC_USER_ROLE_ID NUMBER null)
    Error creating table: ORA-01748: only simple column names allowed here
    Auto-creating table: create table Permission_RolePermiss__768cc0 (Permission_SECFUNCTIONID NUMBER not null, RolePermission_SEC_ROLE_FUNCTION_ID NUMBER null)
    Error creating table: ORA-00972: identifier is too long
    Auto-creating table: create table Role_UserRole_role_User_39lzhp (Role_SEC_ROLE.SEC_ROLE_ID NUMBER not null, UserRole_SEC_USER_ROLE_ID NUMBER null)
    Error creating table: ORA-01748: only simple column names allowed here
    Auto-creating table: create table Role_rolePermissions_Ro_4ocmz1 (Role_SEC_ROLE.SEC_ROLE_ID NUMBER not null, RolePermission_SEC_ROLE_FUNCTION_ID NUMBER null)
    Error creating table: ORA-01748: only simple column names allowed here
    orion-ejb-jar.xml
    <?xml version="1.0"?>
    <!DOCTYPE orion-ejb-jar PUBLIC "-//Evermind//DTD Enterprise JavaBeans 1.1 runtime//EN" "http://xmlns.oracle.com/ias/dtds/orion-ejb-jar.dtd">
    <orion-ejb-jar deployment-version="9.0.3.0.0" deployment-time="e6e55808e4">
    <enterprise-beans>
    <session-deployment name="SecurityComponent" >
    </session-deployment>
    <session-deployment name="ServerSecurityContext" >
    </session-deployment>
    <session-deployment name="SecurityComponentSecurityFilter" >
    </session-deployment>
    <entity-deployment name="User" table="SEC_USER" data-source="datasource-ecmsOraclePool">
    <primkey-mapping>
    <cmp-field-mapping name="userId" persistence-name="SEC_USER.SEC_USER_ID" />
    </primkey-mapping>
    <cmp-field-mapping name="partyId" persistence-name="PARTY_ID" />
    <cmp-field-mapping name="officeLocation" persistence-name="OFFICE_LOCATION" />
    <cmp-field-mapping name="defaultLocation" persistence-name="DEFAULT_LOCATION" />
    <cmp-field-mapping name="districtVvid" persistence-name="DISTRICT_VVID" />
    <cmp-field-mapping name="userName" persistence-name="SEC_USER.USER_NAME" />
    <cmp-field-mapping name="accountActivationDate" persistence-name="SEC_USER.ACCOUNT_ACTIVATION_DATE" />
    <cmp-field-mapping name="defaultBranchVvid" persistence-name="SEC_USER.DEFAULT_BRANCH_VVID" />
    <cmp-field-mapping name="unitVvid" persistence-name="UNIT_VVID" />
    <cmp-field-mapping name="sectionVvid" persistence-name="SEC_USER.SECTION_VVID" />
    <cmp-field-mapping name="lastUpdateTimestamp" persistence-name="AUD_SYSTEM_OBJECT.LAST_UPDATE_DT" />
    <cmp-field-mapping name="lastUpdateSystemUserId" persistence-name="AUD_SYSTEM_OBJECT.LAST_UPDATE_SEC_USER_ID" />
    <cmp-field-mapping name="failedLoginAttempts" persistence-name="SEC_USER.LOGIN_ATTEMPTS_CNT" />
    </entity-deployment>
    <entity-deployment name="Permission" table="SEC_FUNCTION" data-source="datasource-ecmsOraclePool">
    <primkey-mapping>
    <cmp-field-mapping name="permisssionId" persistence-name="SEC_FUNCTION_ID" />
    </primkey-mapping>
    <cmp-field-mapping name="permission" persistence-name="FUNCTION_NAME" />
    </entity-deployment>
    <entity-deployment name="Role" table="SEC_ROLE" data-source="datasource-ecmsOraclePool">
    <primkey-mapping>
    <cmp-field-mapping name="roleId" persistence-name="SEC_ROLE.SEC_ROLE_ID" />
    </primkey-mapping>
    <cmp-field-mapping name="lastUpdateTimestamp" persistence-name="AUD_SYSTEM_OBJECT.LAST_UPDATE_DT" />
    <cmp-field-mapping name="lastUpdateSystemUserId" persistence-name="AUD_SYSTEM_OBJECT.LAST_UPDATE_SEC_USER_ID" />
    <cmp-field-mapping name="roleDescription" persistence-name="SEC_ROLE.ROLE_DESC" />
    <cmp-field-mapping name="roleName" persistence-name="SEC_ROLE.ROLE_NAME" />
    </entity-deployment>
    <entity-deployment name="Password" table="SEC_USER_PASSWORD" data-source="datasource-ecmsOraclePool">
    <primkey-mapping>
    <cmp-field-mapping name="passwordId" persistence-name="SEC_USER_PASSWORD_ID" />
    </primkey-mapping>
    <cmp-field-mapping name="password" persistence-name="PASSWORD" />
    <cmp-field-mapping name="activeInd" persistence-name="ACTIVE_IND" />
    <cmp-field-mapping name="expirationDate" persistence-name="EXPIRATION_DATE" />
    <cmp-field-mapping name="userId" persistence-name="SEC_USER_ID" />
    <cmp-field-mapping name="lastUpdateTimestamp" persistence-name="AUD_SYSTEM_OBJECT.LAST_UPDATE_DT" />
    <cmp-field-mapping name="lastUpdateSystemUserId" persistence-name="AUD_SYSTEM_OBJECT.LAST_UPDATE_SEC_USER_ID" />
    </entity-deployment>
    <entity-deployment name="UserRole" table="SEC_USER_ROLE" data-source="datasource-ecmsOraclePool">
    <primkey-mapping>
    <cmp-field-mapping name="userRoleId" persistence-name="SEC_USER_ROLE_ID" />
    </primkey-mapping>
    <cmp-field-mapping name="sequenceNum" persistence-name="SEQUENCE_NUM" />
    <cmp-field-mapping name="userId" persistence-name="SEC_USER_ID" />
    <cmp-field-mapping name="roleId" persistence-name="SEC_ROLE_ID" />
    </entity-deployment>
    <entity-deployment name="RolePermission" table="SEC_ROLE_FUNCTION" data-source="datasource-ecmsOraclePool">
    <primkey-mapping>
    <cmp-field-mapping name="roleFunctionId" persistence-name="SEC_ROLE_FUNCTION_ID" />
    </primkey-mapping>
    <cmp-field-mapping name="functionId" persistence-name="SEC_FUNCTION_ID" />
    <cmp-field-mapping name="roleId" persistence-name="SEC_ROLE_ID" />
    <cmp-field-mapping name="sequenceNum" persistence-name="SEQUENCE_NUM" />
    </entity-deployment>
    </enterprise-beans>
    <assembly-descriptor>
    </assembly-descriptor>
    </orion-ejb-jar>

    Hi David,
    The "autocreate-tables" attribute is defined in the DTD for the "orion-application.xml" file. This file is usually generated automatically by OC4J, from your application's "application.xml" file when you deploy your application. (Note that subsequent [re-]deployments of your application will not change the "orion-application.xml" file.)
    The default value for this attribute is 'true', and the default is defined in the "application.xml" file that is located in the "j2ee/home/config" subdirectory of the OC4J installation.
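    For example (a hedged sketch of the attribute Avi describes, placed on the root element of orion-application.xml; surrounding content elided):
    <!-- Hedged sketch: disable table auto-creation for this application -->
    <orion-application autocreate-tables="false">
        ...
    </orion-application>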
    Hope this answers your question.
    Good Luck,
    Avi.

  • Technical Design Review Question: Data target Mapping and transformation

    I got my hands on the technical design documentation for a project on COPA budget. I came up with a few questions, but I will post them separately for fast closing and awards:
    In the discussion of data target mapping and transformation, there was a table of characteristics showing dimensions, BW field, source field, data type, etc.
    1. What is the technique for deciding which characteristics get grouped together into a particular dimension?
    2. Why do some dimensions have only one characteristic, and what is its significance?
    3. I saw one BW field, OVAL_TYPE (description = valuation type), included in three dimensions: Customer, Material and Valuation Type (whose only field is OVAL_TYPE). What is the significance of this repetition?
    Thanks

    Morning,
    To define a dimension means to group together characteristics which have a "1:n" relationship to each other, not an "n:m" one, in order to reduce data volume (cardinality). If you reduce cardinality (i.e. define 1:n relationships whenever possible and group those characteristics within one common dimension), you improve performance because of the smaller data volume. It is essential to understand that this cannot be changed easily after definition in PROD, so data modeling is, also from this point of view, very important.
    Example:
    An accounting object has an "m:n" relationship to the accounting partner object; that's the reason why the accounting object and the accounting partner object do not belong to the same dimension.
    Ni hao
    Eckhard Lewin

  • Trying to populate a table with data from WebRowset

    Hi,
    I want to be able to populate my tables with data from WebRowSets that have been saved to files. Everything goes well until I get to acceptChanges(), at which point I get a NullPointerException.
    Here's the code...
    WebRowSet wrs = new WebRowSetImpl();
    FileReader reader = new FileReader(inputFile);
    wrs.readXml(reader);
    wrs.beforeFirst();
    CachedRowSet crs = new CachedRowSetImpl();
    crs.setSyncProvider("com.sun.rowset.providers.RIXMLProvider");
    crs.populate(wrs);
    crs.beforeFirst();
    crs.acceptChanges(con);
    Results in...
    java.lang.NullPointerException
    at com.sun.rowset.CachedRowSetImpl.acceptChanges(CachedRowSetImpl.java:867)
    at com.sun.rowset.CachedRowSetImpl.acceptChanges(CachedRowSetImpl.java:919)
    I'm using Java 1.5_02. I looked at the source code for CachedRowSetImpl, and the only thing I could think of is that maybe "provider.getRowSetWriter()" in the following snippet is returning null....
    public void setSyncProvider(String s)
        throws SQLException
    {
        provider = SyncFactory.getInstance(s);
        rowSetReader = provider.getRowSetReader();
        rowSetWriter = (TransactionalWriter) provider.getRowSetWriter();
    }
    Any ideas?? Thanks!

    I have the same problem after setting com.sun.rowset.providers.RIXMLProvider.
    Looks like a bug to me.
    By the way, why are you creating a new CachedRowSet and populate it with a WebRowset (which extends CachedRowSet)?

  • "member of " not working with nested table of dates

    Can someone explain why this doesn't work? The last SELECT does not return any row; when trying the same with a table of NUMBER, it works fine.
    drop table test;
    drop type date_tab;
    create type date_tab as table of date;
    /
    create table test(dates date_tab) nested table dates store as dates;
    insert into test values (date_tab(to_date('10-jan-2007','DD-MON-YYYY'), to_date('15-jan-2007','DD-MON-YYYY'), to_date('15-jan-2007','DD-MON-YYYY')));
    commit;
    select * from test where to_date('10-jan-2007','DD-MON-YYYY') member of dates;
    The line above should find the row, but does not return anything.
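    As an aside, a hedged workaround sketch (not from the thread): unnest the collection with TABLE() and compare against an unambiguous ANSI date literal instead of using MEMBER OF:
    -- Hedged sketch: unnest the nested table and compare the dates directly.
    select *
    from   test t
    where  exists (select 1
                   from   table(t.dates) d
                   where  d.column_value = date '2007-01-10');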

    > With 10G Oracle said that these two engines are now sharing the same source code
    Is that documented somewhere? Regarding database versions, it was 9.0.1 that claimed an integrated parser. I don't see any update for 10g.
    > So in the old days one had to do a [SELECT sysdate INTO d FROM dual] to assign a function value to a PL/SQL var - as the function did not exist in PL/SQL.
    Then (from Oracle 7 onwards?) these functions were also available in PL/SQL. However, the two engines did not share common code. So functions in SQL did not always behave like function in PL/SQL and vice versa.
    I don't recall that limitation in PL/SQL v1 (Forms 3 to 4.5, and database v6, though I doubt many people actually used PL/SQL on the database because (1) it was separately licensed, and (2) it didn't have stored procedures) - but then it was a while ago so I could be mistaken.
    I know USER, SYSDATE and others used to be implicit queries of dual (i.e. the supplied PL/SQL function was just a wrapper containing SELECT SYSDATE INTO dt FROM dual; RETURN dt;) although that's probably just confusing the issue.

  • How to transport/move a table with data from development to Test to Production

    Hi,
    How do I transport/move a table with data from Development to Test to Production? Export/Import of a Delivery Unit moves only the structure and not the data.
    Reg
    Sri

    Hi Sri,
    You cannot transport data via the transport route in HANA; you can only transport code changes/structures via a DU. For data movement, you either have to do an export/import from a flat file or use replication from a source system to HANA.
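    For the export/import path, a hedged sketch in HANA SQL (schema, table and server-side path are hypothetical; check the EXPORT/IMPORT options available on your revision):
    -- Hedged sketch: export the table with data on the source system...
    EXPORT "MYSCHEMA"."MYTABLE" AS CSV INTO '/tmp/mytable_export';
    -- ...copy the export directory to the target host, then:
    IMPORT "MYSCHEMA"."MYTABLE" FROM '/tmp/mytable_export' WITH REPLACE;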
    Thanks Much,
    Abhishek

  • Sorting table by date doesn't work properly

    what am I doing wrong?
    I've made a table with 5 columns and many rows, with one column being the date.
    I need to sort the entire table by date, but it doesn't seem to be very accurate: sorted items appear out of date order.
    Suggestions?
    pete

    Hello
    I know that.
    But if you receive a foreign document (i.e. not a Pages one) embedding dates stored as mm/dd/yyyy or dd/mm/yyyy, your app will not recognize them as dates. I see this behavior every day.
    I repeat: not with iWork '09 documents, because in those, dates are stored as numbers (the number of seconds since 01/01/2001) and displayed according to your system settings.
    If you receive a CSV or a TSV file, your machine will not recognize the embedded dates, because most users of these formats don't know what the ISO format is.
    Back to the date storage.
    At first, only column B was filled. The string "Yvan" was just used to help me reach this storage area in the index.xml file. Here is a screenshot of this area; the small arrows point to the numbers describing the dates.
    In the completed table, in column C I copied the values stored in the index.
    Cell D2 contains the formula:
    =-DATEDIF(B,$B$3,"D")
    I was forced to use the reverse of the others because DATEDIF refuses a start date greater than the end date.
    In D3, the formula is:
    =DATEDIF($B$3,B,"D")
    then apply Fill Down.
    These calculate the count of days.
    In E2, the formula is:
    =D*24*60*60
    apply Fill Down.
    It calculates the number of seconds corresponding to the number of days and, of course, it matches what is stored in the index.xml file.
    Given that, an iWork '09 document created anywhere will show correct dates when opened anywhere, whichever format is used for creation or for reading.
    Theoretically, it would work the same if you import a M…Soft document (they also use numbers to store dates).
    I can't check for iWork '08 documents because I don't have old foreign ones in good health. Those which I kept are corrupted ones, in which I look from time to time trying to find where the wrongdoer is.
    When the source file is a CSV or a TSV one, the apps have no way to decide which format is used.
    Of course, if the third component has four digits, it's a year.
    If a first two-digit component is greater than 12, it's a day.
    If a second two-digit component is greater than 12, it's a day.
    But that's all. There is no way to know for sure what 5/6/2010 is.
    As you use the ISO format, for you it will be a string.
    For a US user, it will be 6 May 2010.
    For a French user, it will be 5 June 2010.
    If the value is the well-known 31/12/1943, for you and for US users it will be a string. Happily, for me, it's 31 December 1943.
    As you often wrote, it would be easier if everybody used the ISO format, but I'm not sure that the grandchildren of my grandson will use it.
    Yvan KOENIG (VALLAURIS, France), Saturday 2 July 2011 19:01:15, iMac 21.5", i7, 2.8 GHz, 4 GB, 1 TB, Mac OS X 10.6.8
    Please: search for questions similar to your own before submitting them to the community.
    To be the AW6 successor, iWork MUST integrate a TRUE DB, not a list organizer !

  • Populate SQL table with data from Oracle DB in ODI

    Hi,
    I am trying to populate a source SQL table with fields from an Oracle DB in ODI. I am trying to perform this using a procedure, and I am getting the following error:
    ODI-1226: Step PROC_1_Contract_Sls_Person_Lookup fails after 1 attempt(s).
    ODI-1232: Procedure PROC_1_Contract_Sls_Person_Lookup execution fails.
    ODI-1228: Task PROC_1_Contract_Sls_Person_Lookup (Procedure) fails on the target MICROSOFT_SQL_SERVER connection Phys_HypCMSDatamart.
    Caused By: weblogic.jdbc.sqlserverbase.ddc: [FMWGEN][SQLServer JDBC Driver][SQLServer]Invalid object name 'C2C_APP.CON_V'.
    My question is: what is the best method to populate a SQL Server DB with data from an Oracle DB? Using a procedure? A specific LKM?
    I found threads referring to using an LKM to populate Oracle tables with data from a SQL Server table... but nothing for the opposite direction.
    Any information would help.
    thanks,
    Eric

    Hi Eric,
    If using an Interface, I would recommend the LKM SQL to MSSQL (BULK) knowledge module. This will unload the data from Oracle into a file, then bulk load the staging db on the target using a BULK INSERT.
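    For illustration, a hedged sketch of the kind of statement that bulk load runs on the SQL Server target (table name, file path and options are hypothetical):
    -- Hedged sketch (T-SQL): load the unloaded flat file into the staging table.
    BULK INSERT dbo.C$_CON_V_STAGE
    FROM 'C:\odi\work\con_v_unload.txt'
    WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n', FIRSTROW = 1);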
    Regards,
    Michael Rainey
