Difficult problems in Data integration

Hi:
I want to use Format Builder to create an MFL file to translate a flat file to XML.
But how can I define an attribute of an element?
Another problem: if every three lines together make up one complete record, how do I
define that scenario?

I have done the things you mentioned.
It is still giving the same problem when I look into the monitor, viz. 'Error in Data selection'.
The details show 'Selected number does not agree with transferred number'.
When I look into the PSA records, it seems to be pulling the same set of 25 records multiple times in multiple packages (16 packages of 25 records each) and then failing.
Any idea why this is happening? Is there a parameter I need to set somewhere?
Thanks

Similar Messages

  • Difficult problems in Data integration

    Hi:
    I use Format Builder to create an MFL file to translate a flat file to XML. But
    how can I define the mapping from binary data to an attribute of the XML element?
    Does it only support elements?
    Another problem: if every n lines (n is defined in the first line) make up one
    complete record for an entity, how do I handle that scenario?


  • Problem with data integration when using KCLJ

    Hello,
    For a project, I had to integrate a new field using transaction KCLJ.
    For this I extended the DDIC structure of the sender structure, and after that, I updated the corresponding transfer rules.
    When I execute transaction KCLJ I have no error, and table BUT000 is updated with the data of the flat file.
    The problem is that it also erases 6 fields in BUT000; they're not in the sender structure and so have no transfer rules.
    Could you help me?

    Hi
    Please read this.
    External Data Transfer
    These activities are not relevant if you use a CRM/EBP system.
    In the following activities you make definitions for transfer of business partner data or business partner relationship data from an external system to a SAP System.
    Data transfer takes place in several stages:
    1. Relevant data is read from the external system and placed in a sequential file by the data selection program. The data structure of the file is defined in the sender structure.
    This procedure takes place outside of the SAP environment and is not supported by SAP programs. For this reason, data changes can be made at this point by the data selection program.
    2. The sequential file is stored on an application server or a presentation server.
    3. The SAP transfer program reads data from the file and places this in the sender structure. This does not change the data. This step is carried out internally by the system and does not affect the user.
    4. Following transfer rules that have to be defined, the transfer program takes the data from the sender structure and places it in the receiver structure. During this step you can change or convert data.
    The receiver structure is firmly defined in the SAP system. Assignment of the sender structure to the transfer program, and of the transfer program to the receiver structure is made using a defined transfer category.
    5. The data records in the receiver structure are processed one after the other and, if they do not contain any errors, they are saved in the database.
    Before you transfer external data for the first time, make the following determinations:
    The structure of the data in the external system may not match the structure expected by the SAP system. You may have to supplement data.
    There are two ways in which you can adapt the structure:
    You make the required conversions and enhancements within the data selection program prior to beginning the transfer to the SAP system. This will be the most practical solution in most cases since you have the most freedom at this point.
    You do the conversion using a specially developed transfer program and transfer rules.
    You then define the fields of the sender structure. The system offers you the option of automatically generating a sender structure that is compatible with the receiver structure.
    You define transfer rules to create rules according to which the fields of the sender structure are linked with those of the receiver structure.
    You now carry out the transfer.
    SAP Enhancements for External Data Transfer
    The following SAP enhancements are offered in the following areas of External Data Transfer:
    Four Customer Exits exist for the data transfer and for the conversion from IDoc segments. The exits are contained in the enhancement KKCD0001. As soon as the Customer Exits are activated, they are carried out for all sender structures or segments. The first two Customer Exits require minimal coding once they are activated.
    The sender structure concept is used when loading data into the SAP system; the segment concept is used in the context of distribution between SAP systems. In both cases it is a record of data to be transferred or converted. It is advisable to code a CASE instruction within the Customer Exit in which, differentiated by sender structure (REPID) or segment, different coding is executed. For the conversion from IDoc segments, the parameter REPID contains the name of the segment, and the parameter GRPID is not filled. You should have a WHEN OTHERS branch within the CASE instruction, in which 'SENDER_SET' is assigned to 'SENDER_SET_NEW' (or 'RECEIVER_SET' to 'RECEIVER_SET_NEW'); otherwise the return code will keep its initial value. A possible solution is sketched in the code sample below.
    The first Customer Exit is accessed before the summarizing or conversion. It is called up as follows:
    CALL CUSTOMER-FUNCTION '001'
      EXPORTING
        GRPID          = GRPID       "Origin
        REPID          = REPID       "Sender program
        SENDER_SET     = SENDER_SET  "Sender record
      IMPORTING
        SENDER_SET_NEW = SENDER_SET  "Modified sender record
        SUBRC          = SUBRC.      "Return code
    If the variable 'SUBRC' is initial, the modified record is processed further; otherwise it is skipped. The parameter 'SENDER_SET_NEW' must be filled in the Customer Exit, as only this field, and not 'SENDER_SET', is processed further. In particular, this means that you must assign 'SENDER_SET_NEW' the value of 'SENDER_SET' even for records that need no special handling.
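    A minimal sketch of such a CASE instruction for the first exit (the sender structure name 'ZSENDER1' is invented for illustration; the parameter names are those from the call above):
    CASE REPID.
      WHEN 'ZSENDER1'.
        " Apply record-specific changes for this sender structure here.
        SENDER_SET_NEW = SENDER_SET.
      WHEN OTHERS.
        " Pass every other record through unchanged; without this
        " assignment SENDER_SET_NEW would stay empty.
        SENDER_SET_NEW = SENDER_SET.
    ENDCASE.
    " Keep SUBRC initial so the record is processed further.
    CLEAR SUBRC.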
    The second Customer Exit is accessed after the summarization and before the update:
    CALL CUSTOMER-FUNCTION '002'
      EXPORTING
        REPID            = REPID         "Sender program
        GRPID            = GRPID         "Origin
        RECEIVER_SET     = RECEIVER_SET  "Summarized record
      IMPORTING
        RECEIVER_SET_NEW = RECEIVER_SET  "Modified summarized record
        SUBRC            = SUBRC.        "Return code
    The modified record is only updated if the variable 'SUBRC' is initial.
    The parameter 'RECEIVER_SET_NEW' must be filled in the Customer Exit, since only this field, and not the field 'RECEIVER_SET', is updated.
    The third Customer Exit is used for replacing variables. It is called up when you load the transfer rules.
    CALL CUSTOMER-FUNCTION '003'
      EXPORTING
        REPID = REPID
        GRPID = GRPID
        VARIA = VARIA
        RFELD = RFELD
        VARTP = VARTP
      CHANGING
        KEYID = KEYID
      EXCEPTIONS
        VARIABLE_ERROR = 1.
    The parameters REPID and GRPID are supplied with the sender structure and the origin. The variable name is in the field VARIA. The name of the receiver field is in the parameter RFELD. The field VARTP contains the variable type; valid types are fixed values of the domain KCD_VARTYP. You transfer the variable values in the parameter KEYID. If an error occurs, use the exception VARIABLE_ERROR.
    The fourth Customer Exit is required in EC-EIS only. It is called up after the summarization and before the determination of key figures. It is a necessary complement to the second Customer Exit, because changes to the keys are taken into account before the database is checked for existing records with those keys.
    The function is called up as follows:
    CALL CUSTOMER-FUNCTION '004'
      CHANGING
        RECEIVER_SET = R
        SUBRC        = UE_SUBRC.
    The parameter RECEIVER_SET contains the receiver record to be changed; it is a changing parameter. If the exit is not used, no changes need to be made to the function module.
    The user exits can be found in the module pool 'SAPFKCIM'. If you want to use the Customer Exits, create a project and activate the Customer Exits with transaction 'CMOD'. The enhancement you must use for this is KKCD0001.
    Note when programming the Customer Exits that they also run when corrected data records are imported into the data pool during post-processing, for both test and real runs.
    I will provide some pointers soon. Give me some time.
    Hope this will help.
    Please reward suitable points.
    Regards
    - Atul

  • Problem in Scheduling a Package/Interface in Oracle Data Integrator

    Hi all,
    I have a problem with scheduling in ODI.
    I have followed these steps:
    1) Launched a scheduler agent from the command line using the command
    agentscheduler "-port=20300" "-v=5"
    2) Created a Physical Agent and a Logical Agent on this port
    3) Created a scenario for the package and scheduled it.
    But the schedule is not running.
    Please provide a solution. Is any step missing, or am I doing something wrong anywhere?

    Hi,
    I have read this thread, and it seems the solution you mentioned may work for me also. But whenever I try to launch the scheduler agent, I get the following errors:
    OracleDI: Starting Scheduler Agent ...
    Starting Oracle Data Integrator Agent...
    Version : 10.1.3.2.0 - 03/01/2007
    com.sunopsis.tools.core.exception.g: java.sql.SQLException: socket creation error
    at com.sunopsis.dwg.cmd.n.a(n.java)
    at com.sunopsis.c.f.run(f.java)
    at com.sunopsis.dwg.cmd.i.y(i.java)
    at com.sunopsis.dwg.cmd.i.run(i.java)
    at java.lang.Thread.run(Thread.java:595)
    Caused by: java.sql.SQLException: socket creation error
    at org.hsqldb.jdbc.jdbcUtil.sqlException(jdbcUtil.java:67)
    at org.hsqldb.jdbc.jdbcConnection.<init>(jdbcConnection.java:2451)
    at org.hsqldb.jdbcDriver.getConnection(jdbcDriver.java:188)
    at org.hsqldb.jdbcDriver.connect(jdbcDriver.java:166)
    at com.sunopsis.sql.SnpsConnection.u(SnpsConnection.java)
    at com.sunopsis.sql.SnpsConnection.c(SnpsConnection.java)
    at com.sunopsis.sql.h.run(h.java)
    Caused by:
    java.sql.SQLException: socket creation error
    at org.hsqldb.jdbc.jdbcUtil.sqlException(jdbcUtil.java:67)
    at org.hsqldb.jdbc.jdbcConnection.<init>(jdbcConnection.java:2451)
    at org.hsqldb.jdbcDriver.getConnection(jdbcDriver.java:188)
    at org.hsqldb.jdbcDriver.connect(jdbcDriver.java:166)
    at com.sunopsis.sql.SnpsConnection.u(SnpsConnection.java)
    at com.sunopsis.sql.SnpsConnection.c(SnpsConnection.java)
    at com.sunopsis.sql.h.run(h.java)
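    The org.hsqldb entries in this trace suggest the agent is still reading the default demo repository settings shipped with ODI (an HSQL database that is not running) instead of your master repository. In ODI 10g those settings live in oracledi/bin/odiparams.bat (odiparams.sh on Unix); the lines to check look like this (the variable names are from the standard 10g file, the values are placeholders for your environment):
    rem Point the agent at your master repository instead of the HSQL demo
    set ODI_SECU_DRIVER=oracle.jdbc.driver.OracleDriver
    set ODI_SECU_URL=jdbc:oracle:thin:@dbhost:1521:ORCL
    set ODI_SECU_USER=odim
    set ODI_SECU_ENCODED_PASS=<encoded password, generated with: agent ENCODE <password>>
    set ODI_SECU_WORK_REP=WORKREP1
    Restart the scheduler agent after editing the file.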
    Thanks in advance
    HA

  • License Problem When Installing Data Integrator

    Hi,
    I have downloaded the BOBJ DATA INTEGRATOR XI R2 ACC.
    I am trying to install on the server that has our BO XI Edge 3.1 installation.
    It's asking me for a licence file?
    The DI Getting Started Guide refers to a Business Objects licensing email address, but I got a bounce-back saying the mailbox isn't monitored anymore.
    I only want to evaluate the product. Is the SAP Account Manager the only person who can give this out? It's just that my boss is away and he is the one who talks with the Account Manager!
    Thanks
    Carston

    I got the link from the SAP Service Marketplace:
    https://websmp101.sap-ag.de/~form/handler?_APP=00200682500000001943&_EVENT=DISPHIER&HEADER=N&FUNCTIONBAR=N&EVENT=TREE&TMPL=INTRO_SWDC_IU_BOBJ&V=INST

  • Finding Error while creating Data Integrator Repository

    Hi,
    I am working with SAP BO Data Integrator. I have created the databases and their logins, and installed SQL Server. While installing Data Integrator I got the following error when creating a new repository:
    Cannot open connection to the repository.  The error message from the underlying DBMS is <ODBC call <SQLDriverConnect> for data source <MSSERVER\SQLEXPRESS> failed: <[Microsoft][ODBC SQL Server Driver][SQL Server]Login failed for user 'AIO_REPO_IDOC'. The user is not associated with a trusted SQL Server connection.>. Notify Customer Support.>. (BODI-20006)
    Can anyone resolve this problem?

    Hi,
    I used SQL Server Authentication to log on to the databases, and for the DI version I am using SAP BusinessObjects XI 3.2.
    I tried again by deleting all the databases and then creating the same databases, but creating the repository for AIO_REPO_IDOC in the Repository Manager gives the following error:
    Cannot open connection to the repository.  The error message from the underlying DBMS is <ODBC call <SQLDriverConnect> for data source <IDHASOFT238933\SQLEXPRESS> failed: <[Microsoft][ODBC SQL Server Driver][SQL Server]Login failed for user 'AIO_REPO_IDOC'. The user is not associated with a trusted SQL Server connection.>. Notify Customer Support.>. (BODI-20006) 
    An error occurred during creation of the local repository. (BODI-300054)
    I also tried accessing the databases through another tool; that works successfully.
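    For what it's worth, 'The user is not associated with a trusted SQL Server connection' usually means the SQL Server instance is set to Windows-only authentication, so the SQL login AIO_REPO_IDOC is rejected regardless of its password. A quick check from any SQL client:
    -- Returns 1 if the instance accepts Windows authentication only,
    -- 0 if mixed mode (SQL Server and Windows authentication) is enabled
    SELECT SERVERPROPERTY('IsIntegratedSecurityOnly');
    If it returns 1, enable mixed mode (instance Properties > Security in Management Studio) and restart the SQL Server service.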

  • Performance degradation in Data Integrator Designer with Multi-user mode

    Post Author: ccastrillon
    CA Forum: Data Integration
    We are developing an information system based on a DataMart populated using ETL processes built with Data Integrator. Three of us are developing, and when we work at the same time we start to have performance problems with Designer: it seems to freeze sometimes and the development process becomes painful.
    Where is the problem? Accessing the repository? Is this a known bug?
    The Job Server? But it happens even when we don't launch any job, only while building ETL processes and manipulating objects (dataflows, workflows, etc.).
    We would appreciate any help. Thanks in advance.
    Carlos Castrilló

    Post Author: bhofmans
    CA Forum: Data Integration
    What do you mean by 'working at the same time'? You need 3 different repositories if you want to work with 3 developers, so there should be no impact at all when working simultaneously...
    -Ben.

  • Data Integrator has no read permissions to get the data file using FTP

    Hi,
    I wonder if you could help.
    We have installed Data Integrator and are using FTP to get the data file from the SAP server to the DI server. The files are created by the SAP admin user with permissions 660, and the FTP user is not in the sapsys group, so it cannot read the files.
    Has anyone come across this problem, and how did you solve it?
    Many thanks.

    Hi,
    you might want to post your entry in the EIM forum, where the Data Integrator topics are also handled:
    Data Services and Data Quality
    Ingo
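    One common Unix-side workaround, assuming you have root access on the SAP host (the user name and path below are hypothetical):
    # Make the 660 files readable by adding the FTP user to the owning group
    usermod -a -G sapsys ftpuser
    # Alternatively, have the job that creates the file loosen its mode
    chmod 644 /interface/out/datafile.dat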

  • Using the cache in Data Integrator

    Hi,
    I'm trying to understand whether it is possible to restrict Data Integrator's use of cache when loading the initial data source, to limit the use of server resources.
    I understand from the manual how to set the dataflow cache type option: Pageable or In-Memory. This option does not solve my problem; I would like to avoid loading the whole cached data source.
    Is it possible to query objects directly without first loading all the data sources into the tables?

    ...base / Open SQL Statement, etc.) The first time I defined the system everything was fine, but when I changed the database (using MS Access) the "Open SQL Statement" would show the tables but not their columns. I'm using Win98 Second Edition / TestStand 1.0.1.

    Hello Kitty -
    Certainly it is unusual that you can still see the tables available in your MS Access database but cannot see the columns. I am assuming you are configuring an Open Statement step and are trying to use the ring control to select columns from your table?
    Can you tell me more about the changes you made to your file when you 'changed' it with MS Access? What version of Access are you using? What happens if you try to manually type an SQL string into the Open Statement dialog, such as...
    "SELECT UUT_RESULT.TEST_SOCKET_INDEX, UUT_RESULT.UUT_STATUS, UUT_RESULT.START_DATE_TIME FROM UUT_RESULT"
    Is it able to find the columns even if it can't display them? I am worried that maybe you are using a version of MS Access that is too new for the version of TestStand you are running. Has anything else changed aside from the file you are editing?
    Regards,
    -Elaine R.
    National Instruments
    http://www.ni.com/ask

  • Regarding data integrity for XI file/jdbc adapter

    I have several questions about XI data integrity for the file/JDBC adapters.
    Question 1: if the destination file or DB is not reachable, or other technical problems occur, do I have to send the message again or not?
    Usually we design a BPM to confirm whether the database or file write completed, according to a response message, but that's a little complicated.
    Question 2: when I use the JDBC sender adapter and configure SELECT and UPDATE SQL clauses, does XI automatically update only the records it selected, or do SELECT and UPDATE have no relationship?
    I would very much appreciate it if you have any experience to share.
    Regards
    Shen Peng

    Hi
    Question 1: if the destination file or DB is not reachable, or other technical problems occur, do I have to send the message again or not?
    If the DB is not reachable, the message will be found on the inbound side, in the adapter. Go to RWB -> Message Monitoring -> select status 'System Error' and search; there you will find your messages. Select your message and resend it.
    Question 2: when I use the JDBC sender adapter and configure SELECT and UPDATE SQL clauses, does XI automatically update only the records it selected, or do SELECT and UPDATE have no relationship?
    XI never updates automatically. The update is done only according to the SQL UPDATE query you have written in the sender channel.
    The SELECT statement just picks values from the table; the data is then updated according to the UPDATE query you specify.
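    As a sketch of that pattern, a status column ties the two statements together (table and column names below are invented for illustration):
    -- SELECT query in the sender channel: pick up unprocessed rows
    SELECT * FROM orders WHERE processed = 0;
    -- UPDATE query, run by the adapter in the same transaction,
    -- so the polled rows are not picked up again
    UPDATE orders SET processed = 1 WHERE processed = 0;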
    Regards,
    sandeep

  • Data Integrity Manager & Synchronize Objects

    Hi there,
    I have 2 questions regarding data integrity and synchronization. If any of you can help me out, I would really appreciate it.
    1) Synchronize Objects (R3AS4): Is this only used to sync the customizing objects, or can it be used to sync master data objects as well? When I run this transaction, I get the message that "Synchronization Ware not activated". Do any of you know how to activate this?
    2) Data Integrity Manager: How do I choose between a header-level comparison and a detail comparison? When I try to run it, nothing seems to happen; I just get the messages "0 objects are equal" and "0 objects are not equal". I am just using the DIMa wizard to start the comparison. Do any of you have the "Data Integrity Manager Cookbook"? If so, can you please send it to me?
    Thanks in advance
    Max

    Hello Max,
    Here are the steps that I just did in my system:
    - transaction SDIMA
    - New DIMa Instance wizard
    - name = customer / object = customer
    - RFC destination = <R/3> / filter mode = All filter / flag to start compare on wizard completion
    - no filter settings
    - complete
    So the compare starts as soon as the wizard is finished.
    You can see the status of the job on the right of the screen (it takes some time for BP). When the status is green, the job is finished.
    Maybe you will get the message "0 objects are equal, 0 objects are not equal", but you should also have, just above it, a message like "421 object(s) exist in both systems".
    I don't think the fact that you are not sending BPs back to R/3 is a problem for using DIMa.
    Hope this will help you.
    Regards,
    Frédéric

  • Need a Data Integration for multiple ERP systems

    We are doing some research into a data integration layer for our BW 7.3/BOBJ 4.0 from multiple ERPs. Of course we are looking at Data Services and Information Steward in the BOBJ suite, but we are looking for anybody's recommendations on the topic.
    What technology platform do you use for the extract, transform, load processes from multiple backend sources into BW? Any you would advise us to avoid?

    Hello Edward,
    The answer depends on multiple factors. Some pointers:
    Consider volume and growth of the databases in scope when planning (federation vs. replication).
    If data federation is where you want to go, Data Services / BO tools will be ideal.
    If your data is coming from multiple ERP systems, you can utilize their delta queues to load data via Data Services (in the case of SAP).
    Use the native DB Connect / UD Connect functionality in BW; as of BW 7.3 it is delta-capable.
    Moving to BW 7.4, you will have SDA to solve the problem of integrating non-SAP data into your EDW landscape.
    These are only pointers; I would say talk to your enterprise architects and consider where you want to take your EDW platform.
    Cheers!
    Suyash

  • Problem in data loading through UD Connect.

    Hello All,
    I have a problem with data loading. I have created the UD Connect source system and it is working properly. Then I created UD Connect InfoSources and DataSources, but when I try to create InfoPackages it gives the following errors:
    "Error occurred in the source system" and "no record found".
    Thanks
    Shivanjali

    Hello Shivanjali,
    Mostly UDC is used for RemoteCube to access data outside of SAP. What is your external system? Make sure that there is data in the source system.
    Please check following links
    [Transferring Data with UD Connect|http://help.sap.com/saphelp_nw04s/helpdata/en/43/e35b3315bb2d57e10000000a422035/frameset.htm]
    [SAP BW Universal Data Integration SAP NW Know-How Call|https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/f094ba90-0201-0010-fe81-e015248bc5dd]
    [SAP BW and ETL A Comprehensive Guide|https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/11e1b990-0201-0010-bf9a-bf9d0ca791b0]
    Thanks
    Chandran

  • Text fields being converted to Long in Data Integrator v 11.7

    I am a new user of BODI, and I recently experienced an issue with source "TEXT" fields being converted to "LONG" in the Query and target datastores respectively. I am using Data Integrator v11.7, and my question is: does anyone know whether or not this issue has been resolved in Data Services XI 3.x? If so, can you also provide documentation on this exact issue? Thanks in advance.
    Lonnie

    Yes, we made some changes in DI 12.0. But what is the problem? A SQL Server TEXT datatype is text with unlimited length, whereas a varchar(maxsize) is limited. In Oracle a CLOB is the same thing. And in DI we call a text of unlimited size a LONG datatype.

  • Data Integration Between Oracle Bases

    Hi everybody,
    I am an Oracle DBA and now I'm experiencing something new in my job: I have to define the way my company will perform a data integration.
    My situation is this: here, people are building a new, huge system called MEGA with its own Oracle database. But there are 20 other smaller systems, each with its own Oracle database, that will share some data with the MEGA database.
    The most common situation here is that some data must be synchronized every x hours between the main MEGA database and the other smaller databases. But some data must be synchronized automatically, to preserve online data integrity.
    My doubt is: what's the best way to do this integration? Should I use materialized views, triggers, procedures, or is there a tool that comes with Oracle Database that I can use to simplify this data integration?
    Do I have to create a database to keep the transactional data?
    If someone can give me any hints, I'll be very thankful.
    Bye, bye...

    Hi Justin,
    Thank you for your reply.
    But I'm still in doubts...
    If I use the "on commit" refresh type, I will solve the problem of data staleness with the materialized view solution, won't I?
    Because whenever I commit on the base tables, the materialized view will be refreshed automatically; am I wrong?
    Another doubt: if I use materialized views, do I have to create a "temporary" database to keep the transactional data, or should I keep the materialized views inside the source and destination databases?
    Which do you prefer in this case of mine: using materialized views or triggers to reflect the changes made in the legacy databases into the central MEGA database?
    I will be waiting for your reply.
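    For illustration of the "on commit" option discussed above, a minimal fast-refresh materialized view (table and column names are invented):
    -- Materialized view log on the base table, required for FAST refresh
    CREATE MATERIALIZED VIEW LOG ON customers WITH PRIMARY KEY;

    -- Refreshed automatically whenever a transaction on CUSTOMERS commits
    CREATE MATERIALIZED VIEW mv_customers
      REFRESH FAST ON COMMIT
      AS SELECT customer_id, name, status FROM customers;
    Note that ON COMMIT refresh only works when the materialized view lives in the same database as its master table; across database links you have to use scheduled refreshes (START WITH ... NEXT) instead.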

Maybe you are looking for

  • Freight condition calculated on header level , not at item level

    Hi, I am creating a PO with 2 line items: mat X qty 1 price 100 INR, mat Y qty 1 price 100 INR. Now I have given freight as 500 INR at header level... My problem is that the freight is getting calculated based on the number of item level... It means if I have 2 item l

  • Can the MS Office files moved to Trash be retrieved

    I just downloaded AppsCleaner from the net and accidentally removed some MS Office files to Trash. Now some of the applications are not opening or working properly and are showing an error message. I tried dragging the deleted files/items to the desktop b

  • Procedure to know No of delta records

    Hi All, there were some changes done in CRMPRD for the DS:Business datasource, so we want to compare the changes in DELTA records in CRM loaded to the BI server. Now, is there any way to know the no. of delta records in the R/3 production (here CRM) for p

  • Business Partner Error in Replication Due to Organizational Structure

    Hi, I tried to replicate one customer from R/3 to CRM; however, it resulted in an error saying sales office O 50000620 sales group O 50000728 not maintained for sales area O 50000613 RM. So I changed the org structure in PPOMA_CRM to maintain the org

  • Dbca does not follow the maxsize values within .dbc for the tablespaces

    Does someone know if DBCA has some sort of bug when reading a .dbc file and following what is written there? Here is a tablespace example where maxsize simply is not considered: <Name id="2" Tablespace="UNDOTBS1" Contents="UNDO" Size="26" maxsize="2G