Table data transfer question
Hi,
There are two tables, accounts and rpt_accounts (notice that the table names are different).
Accounts
id name
1 vishal
2 gufran
3 mazil
Rpt_accounts
id name
1 vishal
2 ibrahim
(Notice that id 2 has a different name and there is no id 3.)
The question is: using the accounts table, rpt_accounts should be updated at id 2 with the value "gufran"; since there is no id 3, the row (3, mazil) should be inserted; and id 1 should remain unchanged.
This is what I told the developers when they asked me whether it can be done using exp/imp:
1) I told them that export and import will not work, because the table names are different and import does not do things like update one row and then insert another.
2) They also do not want to truncate rpt_accounts and then do a full import.
So they asked me for any options using SQL*Loader.
Any ideas how this could be done? If you know of any possible solution, please let me know.
Thanks
Why would you be looking at external tools for this sort of thing? If you're just updating a destination table from a source table, a single MERGE statement would seem sufficient.
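For reference, a minimal sketch of such a MERGE, assuming id is the join key and both tables are accessible in the same schema:

```sql
-- id 2 is updated to 'gufran' and (3, 'mazil') is inserted.
-- id 1 is rewritten with the same value, so it is effectively unchanged.
MERGE INTO rpt_accounts r
USING accounts a
   ON (r.id = a.id)
WHEN MATCHED THEN
  UPDATE SET r.name = a.name
WHEN NOT MATCHED THEN
  INSERT (r.id, r.name)
  VALUES (a.id, a.name);
```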
Justin
Similar Messages
-
Hi,
One basic question on table data transfer. We have a client-dependent custom Z table. When we transport the table's data to quality and production, will the client field automatically be updated with the number of the client we are transporting to, or will it remain the same? E.g. if we transport from dev client 100 to quality client 200, will every data record's client field be updated to 200 instead of 100? If not, please let me know how it should be done.
Thanks
Swetha.
It will get automatically updated with the target client number.
You can check the customizing tables. For example, T001 (company codes) has the MANDT field. We create the company codes in customizing and then transport them to quality or production. Those requests contain the table entries, and the entries are saved with the appropriate client in the other systems.
Regards,
Naimesh Patel -
Newbie Table Data Source Question
Hi,
I have a question about table data sources. I can set them up OK but I can't find the answers I'm looking for in the documentation or online.
With a table data source, the content column is the one that's indexed, right? Are the attribute columns indexed as well? If the content column is null, will the attribute columns still be indexed?
Thanks in advance
Mark.
The attribute columns are indexed along with the content column. However, the attribute columns will not be indexed if the content column is null; that is, an empty document is not indexed.
The workaround is to at least fill in a few space characters to the content column.
Steve -
Data transfer into partition table took more time
Scenario: We have a staging table (consider it as table A) which is refreshed every month. Every month table A is truncated and then loaded with roughly 6 million records. Once the mandatory updates are done on table A, we have to move the records to the main table (table B). Table B is partitioned by month and contains the complete history of records from 2010 onwards. Table B has one clustered index (on the month column) and three non-clustered indexes; it currently holds 180,562,235 records. During the transfer I did not disable the indexes, because re-enabling them afterwards took even more time.
Question: The data transfer from table A to table B took about 2.30 hours. Is there any way to reduce the transfer time?
Any suggestion to reduce the time would be helpful.
Try with SWITCH partition...
eg:
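A hedged sketch of what the switch looks like; the table, partition-function, and boundary names here are made up, and the staging table must match table B's schema, indexes, constraints, and filegroup:

```sql
-- Switch the fully loaded staging table into table B's partition for
-- the month being loaded. This is a metadata-only operation, so it is
-- near-instant regardless of row count. A CHECK constraint on the
-- staging table must guarantee its rows fall inside that partition's
-- boundary range.
ALTER TABLE dbo.StagingTableA
    SWITCH TO dbo.TableB
    PARTITION $PARTITION.pf_MonthRange('2014-06-01');
```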
http://www.mssqltips.com/sqlservertip/1406/switching-data-in-and-out-of-a-sql-server-2005-data-partition/ -
HOW TO TRANSFER USER DEFINED TABLE DATA IN NEW DATABASE
Dear all, I am trying to transfer user-defined table data into a new database. Can you please tell me how I can do it?
Thanks for Quick responce
I have Following databases
1) Targate Database EOU TEST
2) Source database STI_BLANK
Table @MACHINE
I tried the following, but it gives an error (a database name containing a space, and the @ table names, must be bracketed):
INSERT INTO [EOU TEST].dbo.[@MACHINE]
SELECT * FROM STI_BLANK.dbo.[@MACHINE]
Edited by: Abhijit Bhise on Apr 1, 2010 3:27 PM -
External data transfer into CO - Profitabilty Analysis tables
Gurus, sending this again as the earlier post was not answered. Please give us some insight ASAP.
We are implementing custom allocations to CO-PA (Profitability Analysis) records externally and trying to post close to a million records into CO-PA table CE1xxxx. What is the most efficient method for posting the externally created records into the CO-PA table? Transaction KE21 handles one entry at a time, and we need to perform a mass data transfer. We also checked the BAPI BAPI_COPAACTUALS_POSTCOSTDATA: it clearly says it is not for mass data transfer, and it updates the CE4xxxx table only. We need the data posted to the CE1xxxx table. Any ideas!!!
There is a transaction KEFC - External data transfer to CO-PA. Has anyone used it? Please provide your insight on this transfer method.
Any suggestions are appreciated.
Thank you.
Ashish,
We use KEFC on a regular basis to upload actual sales data to PA from a third system.
An upload file is created in Excel and saved as a TXT file. The structure of that Excel file must equal the structure defined in customizing: use 'define data interface' to find the structure, and 'define structure of external data transfer' to see the respective values of the columns you need in your Excel file.
Hope this works for you!
Regards, -
Transfer z-table data from ECC to CRM via Middleware
Hi,
I need to transfer some z-table data from ECC to CRM using middleware. Does someone have any tip or reference link?
André
Hi,
The following link shows the replication from CRM to ECC. The same can be followed for replication from ECC to CRM.
Replication of Z table from CRM to R/3 - No mBDoc Created
Regards,
Susanta -
BAPI - DATA TRANSFER FROM .XLS FILE TO DB TABLE
I am uploading data from an xls file to a database table.
First I load the xls file data into an internal table (it_data); that part works.
Then I move the data from the internal table to the database table (zskv_rate) via function module ZBAPI_RATE_SAVEDATA, which is not working.
ex.
LOOP AT IT_DATA.
* Header
  ZSKV_RATE-DIVN  = IT_DATA-DIVN.
  ZSKV_RATE-REGN  = IT_DATA-REGN.
  ZSKV_RATE-PLANT = IT_DATA-PLANT.
  ZSKV_RATE-RATE  = IT_DATA-RATE.
  CALL FUNCTION 'ZBAPI_RATE_SAVEDATA'
    EXPORTING
      HEADDATA = ZSKV_RATE
    IMPORTING
      RETURN   = BAPI_RETURN
    TABLES
      DATA     = IT_DATA.
  IF BAPI_RETURN-TYPE = 'E'.
    WRITE:/ 'Error:', BAPI_RETURN-MESSAGE, 'for material:', IT_DATA-DIVN.
  ELSEIF BAPI_RETURN-TYPE = 'S'.
    WRITE:/ 'Successfully created material', IT_DATA-DIVN.
  ELSE.
    WRITE:/ 'NONE', IT_DATA-DIVN.
  ENDIF.
ENDLOOP.
How can I transfer the data from the internal table to the database table?
Hi Sanjeev,
as you have uploaded the Excel file into the internal table, you can update the Z table from the internal table:
MODIFY zskv_rate FROM TABLE it_data.
regards.
Mdi.Deeba -
TDMS Data transfer Step : Long runtime for table GEOLOC and GEOOBJR
Hi Forum,
The data transfer for tables GEOLOC and GEOOBJR is taking too long (almost 3 days for 1.7 million records). There are no scrambling rules applied to these tables; in fact, since I am using TDMS for the first time, I am not using any scrambling rules at all to start with.
Also, there are 30 processes which have gone into error. How do I run those erroneous jobs again?
Any help is greatly appreciated.
Regards,
Anup
Thanks Harmeet,
I changed the write type for those activities and re-executed them, and they completed successfully.
Now the data transfer is complete, but I see a difference in the number of records for these two tables (GEOLOC and GEOOBJR).
Can you please let me know what might be the reason?
Regards,
Anup -
How to transfer table data from one client to another client?
Hi,
Look at the Blog
/people/community.user/blog/2007/01/07/transport-table-entries
Regards
Sudheer -
HCM TDMS data transfer phase hanging and no tables transfered
Hello SAP Colleagues / Practitioners,
Background: We are running TDMS 4.0 HCM PA PD Copy
We are now at the data transfer phase of the data copy, and it has already been an hour since we triggered it, but the tables are still not being populated. Is there a way to check whether the transfer is really processing? The job CNV_MBTCNV_MBT_PEM_START is running in the control system, but in the receiver system there are no TDMS work processes running. Is this hanging, or is there a way to check whether tables are really being processed? Thank you.
Regards,
Meinard
You can check the transfer progress in transaction DTLMON in the central system. Enter the mass transfer ID of your package and click the execute button; in the next screen click the tab 'relevant tables', where you can see how many tables have been processed. For more detailed information, change the view to calculate throughput.
-
How to transfer internal table data to the application server in XML format
How can I transfer internal table data to the application server in XML format?
Hi, if you want to transfer an internal table to the application server as XML, do it like this:
v_xml_table is your internal table, and li_xml_table is defined as a line of your internal table.
OPEN DATASET g_file FOR OUTPUT IN BINARY MODE.
LOOP AT v_xml_table INTO li_xml_table.
  TRANSFER li_xml_table TO g_file.
ENDLOOP.
CLOSE DATASET g_file.
<removed_by_moderator>
Regards,
Prasad.
Edited by: Julius Bussche on Jul 16, 2008 2:43 PM -
Increase of work processes for data transfer for specific tables
Hi,
We are using TDMS to transfer around one month's data from our ECC quality system to an ECC test system. The test system was built using shell creation, and the transfer was started.
We have allocated 13 batch processes in the central system and the receiver system. There are currently two tables for which the transfer is still running; the transfer is complete for all other tables. But even though there are 13 processes available, only 3 processes are allocated to one table and 2 processes to the other.
Since the total time depends on the complete transfer of these two tables, and since there are 8 free processes available, we would like to assign the remaining processes to these tables so that the transfer finishes faster.
Please guide me if there is any way to change the same.
Regards,
Ragav
Hi,
Thanks to Sandeep and Pankaj for the replies.
I've started the data transfer in the CRM landscape, and there I was able to define the read method, the write method, and some technical settings for the transfer.
In the technical settings for the transfer, I was able to increase the number of batch processes run in parallel.
Regards,
Ragav -
Question: New Hard Drive and Data Transfer Kit Device
Yep, I got another HDD, this time for my lil Compaq Presario 700 series notebook (OS: Win XP). I bought a DriveWire Universal Adapter & Data Transfer Kit. I want to know if there would be any issue transferring my iTunes library to my new HDD. I do have some music in my iTunes library that's DRM protected. Should I still deactivate my account during the transfer... that is, if it's possible? Thank you for any assistance you can give me. iTunes 10, by the way.
From what you posted, you should be able to set the iTunes media folder location to the exHD and then run the iTunes consolidate library command.
That will put your itunes library content onto the exHD.
Here's Apple's article on that
http://support.apple.com/kb/ht1364
Pay attention to this note from it
+Note: If you move your library to an external drive, you'll need to power up and connect the drive before starting iTunes. If you don't, iTunes will prompt you to either locate the library or to create a new library (which will result in an empty library). No matter what, it's always a good idea to have a backup of the media you have in iTunes.+
That means you MUST have the exHD connected BEFORE starting iTunes. What it doesn't mention is that the drive always needs to have the same drive letter. Windows can change a USB drive's letter without telling you it's doing it. Best practice is to assign a permanent drive letter, such as M: for Media. You can assign drive letters using Windows Disk Management.
There's no need to deactivate any account during this procedure - why do you think you should? -
How to read a table and transfer the data into an internal table?
Hello,
I am trying to read all the data from a table (all attribute values from a node) and to write these data into an internal table. Any idea how to do this?
Thanks for any help.
Hi,
Check this code.
Here I create a context with one node, i.e. FLIGHTS, whose attributes come from the SFLIGHT table.
DATA: lo_nd_flights TYPE REF TO if_wd_context_node,
      lo_el_flights TYPE REF TO if_wd_context_element,
      ls_flights    TYPE if_main=>element_flights,
      it_flights    TYPE if_main=>elements_flights.
* navigate from <CONTEXT> to <FLIGHTS> via lead selection
lo_nd_flights = wd_context->get_child_node( 'FLIGHTS' ).
CALL METHOD lo_nd_flights->get_static_attributes_table
  IMPORTING
    table = it_flights.
Now the table data will be in the internal table it_flights.