Data import for CAF generated tables

Hello,
I developed a CAF application with a custom Web Dynpro UI. As the persistency so far consists only of "local persistency", all the data is stored in CAF tables on my NW2004s system. I know that I can create records by hand in the Service Browser, but that is time-consuming and I have more than 10,000 records.
How can I import data into these tables? Are there any upload functions for CSV/XLS or other file types?
Thanks and regards
Joschi

Hi all,
As you all discussed importing data into SQL tables from the CAF database, I understood the concept.
Can you please provide a step-by-step approach, or any document, so that I can try this in practice?
I'm new to CAF development. I want to practice a scenario on CAF remote persistency; for that
I want to use SQL tables in CAF entity services and develop some functionality (CRUD methods).
I know that to use SQL tables, we have to access those tables from a Java application, expose it as a web service, and then consume that web service via a CAF external service.
Is that the correct approach?
Can anyone explain the exact procedure (or point me to a document)?
How can I achieve CAF remote persistency in the case of SQL Server?
Thanks & Regards
sowmya.

Similar Messages

  • Creation of a generic extractor and data source for the FAGLFLEXA table

    Hi All,
    Need to create a generic extractor and data source for the FAGLFLEXA table to support AR reporting. This table contains the necessary profit center information to perform LOB reporting against the AR data.
    Please advise on how to do this.
    Regards, Vishal

    Hi Vishal,
    It seems a simple enough task.
    1. Go to RSO2 and choose the relevant option, i.e. whether you want to create a transactional DS, master data DS or text DS.
    2. Name it accordingly and create it.
    3. Give it a description and then enter the table name FAGLFLEXA.
    4. Save it and activate. If you need it to be delta-enabled, then click on Delta and choose accordingly.
    If you still face some problem then do mail me at [email protected]
    Assign points if helpful
    Regards,
    Himanshu

  • Data import for users of forms created with Livecycle Designer

    Hello,
    I have seen several posts regarding data import for forms created by LiveCycle Designer, but nothing that helps with what I am trying to accomplish.  I can create a data connection and import information into a form, but what I would like to do is import data, then send the PDF to a user for completion.  There are a few data elements that I have available, and the rest of the information comes from the user.  The problem I run into is that once I create a data connection, the PDF is ALWAYS looking for the source file for that data.  I simply want to prepopulate some fields and send the form to the various users for completion.  Any help would be greatly appreciated.
    Thanks!

    Which type of Data Connection are you trying to create?
    XML Schema, Sample Data File or WSDL?
    Creating either of the first two types (mentioned above) will only create the schema and will never import any data into the PDF.
    If you create a WSDL connection, you can import data (i.e. prepopulate data) into your PDF and forward it for the users to review/fill.
    If I misunderstood your question, please clarify.
    Nith

  • How to find unused Transfer Rules and Data Sources for a Master Table?

    How can I find unused transfer rules and data sources for a master data table? My requirement is that I need to delete those transfer rules and data sources which are not in use.

    Hi
    Go to the Manage screen of the text or attribute of the master data object, see which ones are being loaded daily from there, and delete the remaining ones.
    Cheers
    Answered as expected, please reward

  • Data entry for Z... table

    I created a table with 3 fields and want to maintain its data. I know that I should use SE54 and then SM30 to do that, but I got this error when I tried to maintain data in SM30.
    View/table ZREGION can only be displayed and maintained with restrictions
    Message no. SV792
    Diagnosis
    You tried to call a maintenance dialog for the view/table ZREGION, for which the table maintenance is only allowed with restrictions. You can only edit the data in the environment of another program or a view cluster.
    Could anybody give me a step by step solution for a user table? Any replies are greatly appreciated!

    Hi,
    I think that when the table maintenance was generated, the option "Display/Maintenance Allowed with Restrictions" was chosen (the Data Browser/Table View Maint. setting on the Delivery and Maintenance tab in SE11).
    Just take another look at that setting and change it to "Display/Maintenance Allowed".
    That should fix the problem.
    Regards,
    Ravi
    Note : Please mark the helpful answers

  • Switch data sources for only 1 table?

    Greetings. In AS 2012 Tabular, I see that I have 3 existing connections. One connection is used as the source for most of the tables, but the other two connections are the source for one table each. I'd like to change the source for those two other tables to use the main data connection, but I can't figure out how.
    My logical guess would be in the table properties, but that's not it.
    TIA, ChrisRDBA

    Hello,
    If your three data connections are from different databases, I don't think we can combine three data connections into one on the Tabular model design surface. If you need to use only one data connection for your Tabular model, please try to combine your underlying tables into one database.
    Please point out if I have something misunderstood.
    Regards,
    Elvis Long
    TechNet Community Support
    Thanks Elvis. As stated above, all tables reside in one DB.
    TIA, ChrisRDBA

  • Issue in data replication for one particular table

    Hi,
    We have implemented Streams in our test environment and are testing the business functionality. We have an issue with data replication for only one custom table; all other tables replicate properly with no issues. When we update 100 rows, data replication does not happen for that particular table.
    Steps to simulate the issue:
    Update one row -- replication is successful.
    Update 100 rows -- after 3-4 hrs nothing has happened.
    Please let me know if any of you have come across a similar issue.
    Thanks,
    Anand

    Extreme slowness on the apply site is usually due to locks, library cache locks, or oversized segments in the Streams technical tables left behind after a failure during a heavy insert. These tables are scanned with full table scans, and scanning millions of empty blocks hundreds of times results in a very big loss of performance, but not to the extent you are describing. In your case it sounds more like a form of lock.
    We need more info on this table: LOB segments? Tablespace in ASSM?
    If the table is partitioned, do you have a job that performs DROP PARTITION? Most interesting are the system waits and, above all, the apply server sessions' waits. Given the time frame, I would rather look for a lock or a library cache lock due to a DROP PARTITION or concurrent updates. While you are performing the update, you may query 'DBA_DDL_LOCKS', 'DBA_KGLLOCK' and 'DBA_LOCK_INTERNAL' to check that you are not caught in a library cache lock.
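    A minimal sketch of such a check, run on the apply site while the 100-row update is in flight. The table name is a placeholder, and DBA_KGLLOCK / DBA_LOCK_INTERNAL are created by catblock.sql, so they may not be installed in every database:
    -- Who holds or requests a DDL / library cache lock on the custom table?
    SELECT session_id, owner, name, type, mode_held, mode_requested
      FROM dba_ddl_locks
     WHERE name = 'MY_CUSTOM_TABLE';   -- placeholder for your table name
    -- Sessions currently waiting on a library cache (KGL) lock
    SELECT kgllkuse, kgllkhdl, kgllkmod, kgllkreq
      FROM dba_kgllock
     WHERE kgllkreq > 0;
    If the second query keeps returning rows for the duration of the stall, the apply server is very likely stuck behind a library cache lock rather than doing slow work.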

  • IPTC and XMP, data important for organizing photos?

    Hi, I keep encountering these two acronyms (IPTC and XMP) while working with my photos in Aperture.  Are these data important, and should I be doing anything with them?  Essentially what I am doing, after importing photos from iPhoto, is filing them into Folders, Projects, and Albums (some Smart). Thanks

    Astechman,
    EXIF (which you didn't ask about) is the Exchangeable Image File Format, and those fields are generally physical attributes of your photos, as recorded by your camera.  Things like date/time, aperture, shutter speed, etc.  Some people like to think they should be able to change those, but that doesn't make any sense (except if your camera's clock is wrong.)
    IPTC (International Press Telecommunications Council) data is used in digital media as metadata for the author to put things.  These include things like keywords, location narratives, copyright notice, photographer name.  I.e., things that don't have anything to do with the camera or the physical attributes of the photo, but about the subject/content of the photo or the photographer.
    XMP adds onto IPTC, and is often associated with a "sidecar" file, in which a digital asset management system (DAMS) saves extra metadata.  (Aperture is a DAMS, but it does not use sidecar files; it keeps data like that within the library.)
    As for what you should be doing with it -- that's up to you.  How much metadata do you want associated with your photos?  Fill in those fields and just ignore the rest.  I tend to fill in keywords and copyright, and that's about it, but there are many other fields that may be of interest to you.
    nathan

  • Help needed: Data source for Facilities management (tables VTB_ASGN*)?

    Hi all,
    I am looking for a data source for facilities management in R/3 (PI 2004_1_470)
    The tables concerned are VTB_ASGN* etc.
    Any feedback is highly appreciated and rewarded.
    Thanks and regards,
    Sally

    Hello Sally,
    Did you ever find a solution to this issue?
    My client is also looking to do reporting on facilities in BW and integrate this data into liquidity planning in SEM-BPS. However, it looks like this will have to be a custom extractor calling one of the two function modules FTR_FC_GET_COMMITMENT or FTR_FC_GET_COMM_DRAW_FEE.
    Gregory

  • Import for just one table fails....

    Hi,
    On the production server I have a DB user, let's call him PRD_USER, with default tablespace PRD_TBL.
    On the development server I have a DB user with the same name PRD_USER, but with default tablespace DEV_TBL.
    On the production server, I use the imp utility to do the import as:
    imp system/manager fromuser=PRD_USER touser=PRD_USER ignore=y file=... log=...
    The import succeeds for about 25 tables, indexes and constraints, but it fails for just one table with an error (I do not remember the exact ORA- error, and I have no access to it right now): tablespace DEV_TBL does not exist.
    Of course this tablespace does not exist in the production environment, but how does this problem arise, since the default tablespace for this user is PRD_TBL and not DEV_TBL?
    Do you have any idea what may be the cause and how I can overcome this problem during the import? (Note: I applied a temporary workaround by taking the SQL creation script of the table and leaving out the reference to tablespace DEV_TBL.)
    Both servers run exactly the same DB version.
    Note : I use DB 10g v.2
    Thank you,
    Sim

    Hi,
    "If the table has partitions, the import is attempting to create the partitions (in the CREATE TABLE statement) on the original tablespace."
    The table is not partitioned.
    I'll check for the second characteristic (a LOB column) on Monday.
    Thank you,
    Sim
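    For reference, if it does turn out to be a LOB column whose stored DDL still carries a TABLESPACE DEV_TBL clause in the dump, the usual workaround is to pre-create the table in the target tablespace and re-run imp with ignore=y so only the rows are loaded. A minimal sketch, with placeholder table and column names:
    -- run on the production database before re-running the import
    CREATE TABLE prd_user.problem_table (        -- placeholder name
      id   NUMBER,
      doc  CLOB
    )
    TABLESPACE prd_tbl
    LOB (doc) STORE AS (TABLESPACE prd_tbl);     -- keep the LOB segment out of DEV_TBL
    Because the table then already exists, imp with ignore=y skips the stored CREATE TABLE (and its DEV_TBL reference) and only inserts the rows. With Data Pump (expdp/impdp) the REMAP_TABLESPACE=DEV_TBL:PRD_TBL parameter avoids the manual step, but that requires the export to be taken with expdp as well.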

  • BUG: Export DDL and Data fails for mixed case table/column names

    Hi there,
    I have found a bug in SQL Developer. See details below.
    Description:
    When "Export DDL and Data) function is used on a table/columns not named in UPPERCASE, sql generated by SQL Developer is invalid.
    Steps to reproduce:
    - open SQL Developer, connect to DB
    - make a table named "lowerCase" (in double quotes, so it won't be automatically changed to capital letters)
    - you may also add some columns, for example "lowerCol1", "UpCol2", ALLUPCOL3
    - add some data rows to the table
    - choose Tools -> Export DDL and Data
    - check exporting of tables and data, on "filter" tabs choose your "lowerCase" table
    - press "Apply"
    Error:
    Generated SQL contains invalid INSERTs: mixed-case table and column names are referenced without the obligatory double quotes, which yields an error when the generated script is executed (see below; the relevant line is the INSERT).
    -- DDL for Table lowerCase
    CREATE TABLE "DBO_HT"."lowerCase"
    (     "lowerCol1" VARCHAR2(100),
         "UpCol2" VARCHAR2(100),
         "ALLUPCOL3" VARCHAR2(100)
    -- DATA FOR TABLE lowerCase
    -- FILTER = none used
    -- INSERTING into lowerCase
    Insert into lowerCase (lowerCol1,UpCol2,ALLUPCOL3) values ('lc','uc','auc');
    -- END DATA FOR TABLE lowerCase
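    For comparison, a hand-corrected form of the same INSERT that would execute against the mixed-case table (a sketch of what the tool should generate, not its actual output; every mixed-case identifier gets double quotes):
    Insert into "DBO_HT"."lowerCase" ("lowerCol1","UpCol2",ALLUPCOL3) values ('lc','uc','auc');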
    Remarks
    SQL Developer: version 1.2.1, build MAIN-32.13
    Oracle DBs: 9.2 & Express
    OS: Windows 2000 Professional
    If you need any more details/testing, let me know. I'd really appreciate a quick patch for this issue...
    Alternatively, do you know of any other simple way of copying a single database (it's called a schema in Oracle, right?) from one computer to another? Possibly something as simple as detaching -> copying -> reattaching the mdf (data) files in SQL Server... I thought that this "Export DDL & Data" function would do, but as you can see I couldn't use it.
    I just need a simple solution that works - one operation on the source to pack everything into files, get the resulting files to the other computer, and one operation to have it running there... I think that such a scenario is very basic, yet I just can't achieve it and I am simply not allowed to spend more time on it (read: our test project fails, my company rejects my "lobbying" and stays with MSSQL :/ )
    Thanks a lot & bye

    Thanks for your reply.
    ad. 1)
    You're right. I just wanted to give some very short feedback on my experiences with SQL Developer, so I didn't think starting new threads would be necessary, but as I was writing it became much bigger than I initially planned - sorry about that. I will create proper threads as soon as possible. Having an "Edit post" button on this forum would also be useful.
    ad. 2)
    Generally, you're right - in most cases it's true that "switching DBMS is a major commitment" and "you will produce terrible code" if you don't learn the new one.
    However, I think that you miss one part of the market here - the market that I think Express is also targeted at. I'd call it the "fire&forget databases" market; MySQL comes to mind as possibly the most common solution here. It's the rather small systems, possibly web-accessed, whose data-throughput requirements are rather modest; the point is to store data at all, and not necessarily in the fastest way, because given the amount of data that is used, even on low-end hardware it will work well enough. What's important here is general ease of use - how easy it is to set up such a system, connect and access data, develop software using it, how much maintenance is needed, how easy this maintenance is, how easy the most common development tasks are, such as creating a DB or moving a DB from a test to a production server. There, "how easy" directly translates to "how much time we need to set it up", which translates to "how much the development will cost".
    Considering current technology, switching the DBMS in such systems is not necessarily a major commitment, and believe me that you will not produce terrible code. In many cases it's as simple as changing a switch in your ORM toolkit: hibernate.dialect = Hibernate.Dialect.OracleDialect vs MySQLDialect vs MsSql2005Dialect
    Therefore, in some parts of the market it's easy to switch DBMS, even on a project-by-project basis. The reason to switch will appear when another DBMS makes life easier => development faster. From that point of view, I can understand my colleagues giving me an embarrassed look and saying "come on, I won't read all these docs just to have the db copied to the test server". And it doesn't mean "they are not willing to learn anything new"; it's just that they feel such a basic task should have a self-explanatory solution that doesn't require mastering any special knowledge. And if they get such simple solutions somewhere else, it costs them nothing to change the Hibernate dialect.
    I think Oracle did a great job with introducing Express to this "fire&forget" market. The installation is a snap, it just works out of the box, nothing serious to configure - the opposite of what I remember from installing and working on Oracle 9 a few years ago. In some places it's still "you need to start SQL*Plus and enter this script", but it's definitely less than before. I also find SQL Developer a great tool; it can do most of what we need to do with the DB, and it's much better and more pleasant to use than the Oracle 9 tools. Still, a few basic things require too much hassle, and I'd say taking your schema to another machine is one of them. So I think that, if you do it well, the "schema copy wizard" you mentioned might be very helpful. If I were to give any general advice for the Express line of DB/tools, I'd say "make things simple" - make it "a DB you can't see".
    That's, IMHO, the way to attract more Express users.

  • 00933 error when clicking on the data tab for the selected table

    Hi,
    I'm getting the following error when selecting a table from the tree view then clicking on the data tab:
    An error was encountered performing the requested operation:
    ORA-00933: SQL command not properly ended
    00933.00000 - "SQL command not properly ended"
    *Cause:
    *Action:
    Vendor code 933
    The exact same tables created from SQL*Plus for a different user in the same database work fine. Note, I'm not typing any SQL in myself. It appears this happens on all tables created as this user. I cannot see any difference in permissions between this user and others that work, but the failing user does have a "-" in the username and password, which I'm suspicious of.
    This is SQL Developer v 1.2.0 (build 29.98) (the most recent, I believe), Oracle Database 10g Express Edition Release 10.2.0.1.0, and Linux (Ubuntu).
    Thanks
    Martin

    "-" is not allowed in identifiers and is almost certainly the cause of your problem. This can be got round by quoting i.e. using "STUPID-NAME" instead of STUPID-NAME.
    I'm slightly surprised that SQLDeveloper doesn't quote the statement. Perhaps it quotes table names but not user names.
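    A quick illustration of the difference (EMP is just a placeholder for any table owned by the hyphenated user):
    SELECT * FROM STUPID-NAME.EMP;     -- parser stops at the hyphen, typically ORA-00933
    SELECT * FROM "STUPID-NAME".EMP;   -- owner quoted, the statement parses and runs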

  • Log data changes for only one table

    Hi,
    We know that to log data changes for a table we need to set the client in the system parameter rec/client and also set the logging flag in the table's technical settings via transaction SE13. But we do not want to enable logging for all tables in the system, only for table T001B.
    Any suggestion?

    Another option would be to set the logging flag only for T001B and accept the defaults... no other logs will be written.
    You might also want to take a look at the "recclient" parameter (without "/") in the transport profile.
    Anyway, I think you are concerned about a performance urban legend? Application data generally has its own change logs (CDHDR, etc.) which are not related to this, and you mostly cannot turn those off anyway.
    Cheers,
    Julius

  • Data type for an internal table var.

    Hi guys,
    I need a variable for referring to an internal table by name.
    TABLENAME may work for database tables, but I need exactly the same thing for internal tables.
    DATA: dim_table TYPE tablename.
    gt_ztsdhr000 is an internal table, and I have tables 000-020:
    CASE ti_table.
        WHEN 'gt_ztsdhr000'.
          " Check whether the database table already contains rows.
          SELECT * FROM (dim_table) INTO TABLE dummy000.
          IF sy-subrc = 0.
            " Rows exist: clear the table, then reload it from the internal table.
            DELETE FROM (dim_table).
            IF sy-subrc = 0.
              INSERT (dim_table) FROM TABLE gt_ztsdhr000.
              IF sy-subrc = 0.
                MESSAGE msg TYPE 'I'.
              ENDIF.
            ENDIF.
          ELSE. " table is empty, just insert
            INSERT (dim_table) FROM TABLE gt_ztsdhr000.
            IF sy-subrc = 0.
              MESSAGE msg TYPE 'I'.
            ENDIF.
          ENDIF.
    And I want something like this:
    DATA: my_table like ittabname.
    my_table = 'gt_ztsdhr000'.
    CASE ti_table .
        WHEN 'gt_ztsdhr000'.
          INSERT (dim_table) FROM TABLE (my_table).

    Hello,
    Is your problem solved?
    One possible way is to call a PERFORM for each new address.
    Hints:
    PERFORM <GET_ADD>  USING  <&key_field_to_read_itab&>
                                   CHANGING  <&addr1&>
                                   CHANGING  <&addr2&>
    Then you can print the address.
    Regards,
    Amarjit

  • Incorrect dates imported for wage display

    Hi All
    I am working on an issue with wrong wages shown in a report; these wages are imported from memory and from database PCL2, through the RT cluster.
    For only a few employees, the date imported in VERSC and RT is a previous date, say 2004, even if we are passing 2007.
    I checked the exports and found that program RPCALCU0 exports the values to database PCL2. But there, as far as I have checked, the values for the current period get exported, as I have passed that period in the selection screen.
    I am not able to find anything, as this is happening for only a few PERNRs, say 2-3.
    Has anyone ever faced this problem before? If you have any suggestions that can help me out, I shall be really grateful.
    Thanks
    Gaurav

    My favorite tool for changing dates is "A Better Finder Attributes".
