Data pulling from R/3 to the BW server in BI 7.0

Hi all,
I am very new to BI 7.0, but I know the concept of pulling data in BW 3.5.
I created a generic DataSource and replicated it into BI 7.0. Now I am confused about how to pull the data from the R/3 server on the BI server, because every screen differs from BW 3.5. Can anybody tell me the way to pull data in BI 7.0, or give me some idea about the BI 7.0 data pulling steps?
Thanks and Regards.
Ankit modi.

Hi Ankit
  As of BI 7.0 there have been several changes in the data loading process.
  First of all, you need to replicate the DataSource (here you can choose between a 3.x and a 7.0 replication; I recommend the latter).
  You are no longer obliged to create an InfoSource.
  Then you have to connect your InfoProvider (DSO, InfoCube, etc.) to the DataSource. This is done with a Transformation (the symbol is similar to the previous Update Rules, but without the small square/dot in the graphic). Once you have created the Transformation, you connect the fields of the DataSource with the InfoObjects of your InfoProvider.
  Finally, you create an InfoPackage to load data from your R/3 system into the PSA, and then a DTP to load the data from the PSA into the InfoProvider. So the overall flow is: DataSource -> Transformation -> InfoPackage (loads to PSA) -> DTP (loads PSA to InfoProvider).
Hope it helps.
Kind regards.
Germán.-

Similar Messages

  • Tell me how to format a date retrieved from an MS SQL Server 2000 database?

    How do I format a date retrieved from an MS SQL Server 2000 database for various uses in my JSP page?

    Or if you want to use JSTL instead of a scriptlet see:
    http://forum.java.sun.com/thread.jspa?threadID=676754&tstart=0

  • Data transfer from Sybase to SQL Server

    Hi ,
    we are pulling data from Sybase to SQL Server using SSIS packages.
    A column of type bit in Sybase is transferred as int in SQL Server.
    Because of that, a Sybase column value of 1 has until now been transferred as -1 in SQL Server.
    Suddenly the value is now coming through as 1 in SQL Server.
    Can anybody tell me what could be going wrong here?
    please help
    thanks in advance

    If it is a bit field in Sybase, the corresponding data type should be bit in SQL Server as well. Bit is Boolean data whose only values are 0 and 1 (and NULL if the column is nullable). The int data type can hold any integer, so the two types are not the same.
    Also, a bit value of -1 is invalid and makes no sense.
    Visakh - http://visakhm.blogspot.com/
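    For illustration only (the original package is SSIS, and the helper below is made up for this sketch), a small Python snippet of why a boolean "true" can surface as -1 and how to normalize whatever the source driver hands back:
        # A "true" stored as an all-ones byte reads as -1 when interpreted as a
        # signed integer (classic VB-style booleans), and as 255 when unsigned.
        print(int.from_bytes(b'\xff', byteorder='big', signed=True))   # -1
        print(int.from_bytes(b'\xff', byteorder='big', signed=False))  # 255

        def normalize_bit(raw):
            """Map whatever the source driver returned (-1, 1, 0, None) to 0/1/None."""
            if raw is None:
                return None
            return 1 if int(raw) != 0 else 0

        print([normalize_bit(v) for v in (-1, 1, 0, None)])  # [1, 1, 0, None]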

  • SQL Server 2005 data pull from Oracle

    Hi, I am learning how to use Oracle and I ran into this problem.
    My database is running on SQL Server 2005.
    I have an ODBC connection to Oracle, and a VB .NET script that queries the Oracle table the data resides in.
    When the job runs and stalls, I do not get a timeout error. It locks up my database and no other source systems can feed data to me; they get backlogged in the queue.
    Is there an easier way for me to make a connection that will pull data on a consistent basis? Please help ...
    Student Learner

    Learning how to use Oracle does not include taking data from Oracle and putting it into SQL Server.
    From your description of what is happening, I agree with BluShadow: this is not an Oracle issue, it is a coding issue in the VB .NET script.
    Posting the statements you are executing, against whatever version of Oracle you are on, would be a good starting point.
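    As a rough illustration of the defensive pattern being suggested (the thread's script is VB .NET; this sketch uses Python with pyodbc purely to show the idea, and the DSN, credentials and query are placeholders): set an explicit query timeout and fetch in batches so a stalled pull fails fast instead of sitting on locks.
        import pyodbc  # assumes an ODBC DSN to Oracle already exists, as in the thread

        conn = pyodbc.connect("DSN=ORACLE_SRC;UID=scott;PWD=secret")  # placeholder DSN/credentials
        conn.timeout = 60          # abort queries after 60 s instead of hanging indefinitely
        cur = conn.cursor()
        cur.execute("SELECT id, amount, load_date FROM src_table")    # placeholder query
        while True:
            batch = cur.fetchmany(5000)   # pull in batches rather than one huge result set
            if not batch:
                break
            # ... hand each batch to the SQL Server side here ...
        cur.close()
        conn.close()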

  • How do I get dates pulled from a database to display in Spanish?

    I'm pulling a date from a MySQL database and need to have it
    display in Spanish. If it were in English, these are the two
    statements I would be using:
    #DateFormat(event_date,"mmmm")#
    #DateFormat(event_date,"mmm d")#
    I tried using <cfset SetLocale("Spanish (Standard)")>
    in my Application.cfm , but it didn't seem to work. I'm not really
    sure how it is supposed to be used or even if it would work if I
    were using it correctly.
    Any help would be appreciated.
    Thanks -CaddyX

    Bagger Vance wrote:
    > setLocale("es_ES"); // or whatever locale you're using for "Spanish"
    > // all the subsequent LS functions will use this locale
    > writeoutput("#lsDateFormat(eventDate,"FULL")#");
    >
    > Paul, thanks for responding. I'm still not sure how to make it work. Do I just
    > put <cfset SetLocale("es_ES")> somewhere in the document? That didn't seem to
    > change the result in the following code:
    You can put it at the top of pages that need to be localized, or better yet in
    your Application.cfm/cfc.
    > <cfoutput>#DateFormat(get_event.event_date,"mmmm")#</cfoutput>
    Yes, it won't change that; the regular CF functions all use the en_US locale. You
    want the "LS" functions (re-read the code snippet above).

  • General question about data recovery from RAID on Windows Server

    hi,
    I would like to buy data recovery software for Windows Server, to recover data from any RAID (or without RAID).
    The company tells me I should be sure the RAID can be recognized correctly as a local disk drive first, otherwise their product will not help me. So if I want to recover data for my customers who run Windows Server with any form of RAID, how
    can I make the server hard disks (RAID 1, 5, 6, etc.) be recognized correctly as local disk drives?
    thanks
    johan
    h.david

    I have a NAS which is connected to my computer through a router.
    I have started a small business, and I also want to offer a data recovery service for deleted documents, photos, formatted drives, etc. for my customers; things that were accidentally erased from RAID or non-RAID storage.
    So I found some data recovery software:
    1 - Stellar Phoenix Data Recovery, which has remote access and network access.
    2 - EaseUS Data Recovery. I liked this second product, but they are telling me:
    you should be sure the RAID can be recognized correctly as a local disk drive first, otherwise our product will not help you.
    I understand both situations you are describing; thanks.
    But my question is: can I not run data recovery software from a bootable USB stick on small servers (like Essentials) with RAID, to recover the data in case of any problem that caused data loss?
    I know that if I install data recovery software on the server to scan it, it will rewrite some places on the hard disk, and that is not wise.
    h.david

  • Data pulling from MARD, MARC, etc. - which table do I have to use first?

    Hi Experts,
    The user enters the material number (matnr), plant, and storage location on the selection screen.
    I need to get the corresponding data from MARD, MARC, MDBS, and VBBE.
    So,
    1) Is it possible that, say, MARD has 10 materials while MARC has 15 and MDBS has only 7? Or does every table definitely have the same 10?
    2) As I need to pull the data from these tables, please suggest in which order I have to pull; I mean, from which table do I pull first, and which table comes next, for more reliability?
    thanq

    HI
    Select from MARD first, then use FOR ALL ENTRIES for MARC and MDBS, roughly like this (field lists and the extra conditions are only sketched):
    SELECT matnr lbstf
      FROM mard
      INTO TABLE itab_mard
      WHERE matnr = p_matnr.           "plus plant / storage location conditions
    IF itab_mard[] IS NOT INITIAL.
      SELECT matnr eisbe
        FROM marc
        INTO TABLE itab_marc
        FOR ALL ENTRIES IN itab_mard
        WHERE matnr = itab_mard-matnr. "and any other condition
      SELECT matnr menge
        FROM mdbs
        INTO TABLE itab_mdbs
        FOR ALL ENTRIES IN itab_mard
        WHERE matnr = itab_mard-matnr. "plus condition
    ENDIF.
    Write it like this and you will get the data if it is available.

  • Data Migration from Oracle to SQL Server 2005

    HI Gurus,
    Kindly advise me how to migrate data from Oracle to MS SQL Server, or vice versa.
    I know of two methods:
    1) Using SQL Developer
    2) Using ODBC.
    Kindly let me know which method is better; I am unsure about both options.
    Kindly advise on the same.
    Thanks

    Usually such questions are asked and answered on the forums of the target system, in this case the MS SQL forums.
    But I will answer.
    You should create a LINKED SERVER in MS SQL that connects to Oracle.
    Then issue a couple of SELECT * INTO <TARGET_TABLE> FROM <ORACLE LINKED SERVER>..<SRC SCHEMA>.<SRC TABLE> statements.
    Install the Oracle Client and the Oracle OLE DB driver on the SQL Server machine.
    Also, Oracle is case sensitive by default and MS SQL is case insensitive by default. If there are primary/unique keys that have mixed-case values in Oracle, then in MS SQL you need to set a case-sensitive collation for them.
    PS. If you need not only a one-time data migration but also real-time replication during an application transition period, you can take a look at heterogeneous replication solutions like Oracle GoldenGate or DataCurrents.
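    A minimal sketch of driving that linked-server copy from Python with pyodbc (the driver name is a common one, and the linked server, schema and table names are placeholders; the T-SQL itself is the SELECT * INTO pattern from the answer above):
        import pyodbc

        # Connect to the MS SQL side; the Oracle linked server must already be configured there.
        conn = pyodbc.connect(
            "DRIVER={ODBC Driver 17 for SQL Server};SERVER=mssql-host;"
            "DATABASE=StagingDB;Trusted_Connection=yes"
        )
        conn.autocommit = True
        cur = conn.cursor()

        # One statement per table to migrate - the same pattern as in the answer.
        cur.execute(
            "SELECT * INTO dbo.EMP_COPY "
            "FROM ORA_LINK..HR.EMPLOYEES"   # <linked server>..<schema>.<table>, all placeholders
        )
        cur.close()
        conn.close()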

  • Data loading from Oracle to SQL Server

    Hi,
    I am trying to push data from Oracle, which runs on HP-UX, to SQL Server, but I don't know an efficient way to connect to SQL Server from Oracle on HP-UX.
    I have heard about Oracle Heterogeneous Connectivity but don't know exactly how to implement it in a Unix environment. If you have a step-by-step guide it would be really helpful.
    Thanks in advance.
    Regards,
    Sajal

    Hello,
    please start reading here about the Oracle Database Gateways products:
    http://www.oracle.com/technetwork/database/gateways/index-100140.html
    HowTo articles are available on My Oracle Support:
    How to Setup DG4ODBC on 64bit Unix OS (Linux, Solaris, AIX, HP-UX) (Doc ID 561033.1)
    How to Setup DG4MSQL (Oracle Database Gateway for MS SQL Server) 64bit Unix OS (Linux, Solaris, AIX,HP-UX) (Doc ID 562509.1)
    Please read also this note:
    Functional Differences Between DG4ODBC and Specific Database Gateways (Doc ID 252364.1)
    The license for DG4ODBC is included in your RDBMS license, but you need to purchase a Third-Party ODBC driver for MS SQL Server.
    The license for DG4MSQL is not included in your RDBMS license.
    There is also a forum for the gateways:
    Heterogeneous Connectivity
    Regards
    Wolfgang

  • Error: Data load from relational source (SQL Server 2005) to Essbase cube

    Hi All,
    I am looking for help from you. I am trying to load data from a SQL Server 2005 table into an Essbase cube using the IKM SQL to Hyperion Essbase (Metadata) module.
    I am getting the error below. Let me know if I am missing something.
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (innermost last):
    File "<string>", line 61, in ?
    com.hyperion.odi.essbase.ODIEssbaseException: Invalid value specified [RULES_FILE] for Load option [null]
         at com.hyperion.odi.essbase.ODIEssbaseMetaWriter.validateLoadOptions(Unknown Source)
         at com.hyperion.odi.essbase.AbstractEssbaseWriter.beginLoad(Unknown Source)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
         at java.lang.reflect.Method.invoke(Unknown Source)
         at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java)
         at org.python.core.PyMethod.__call__(PyMethod.java)
         at org.python.core.PyObject.__call__(PyObject.java)
         at org.python.core.PyInstance.invoke(PyInstance.java)
         at org.python.pycode._pyx3.f$0(<string>:61)
         at org.python.pycode._pyx3.call_function(<string>)
         at org.python.core.PyTableCode.call(PyTableCode.java)
         at org.python.core.PyCode.call(PyCode.java)
         at org.python.core.Py.runCode(Py.java)
         at org.python.core.Py.exec(Py.java)
         at org.python.util.PythonInterpreter.exec(PythonInterpreter.java)
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)
         at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
         at com.sunopsis.dwg.cmd.e.i(e.java)
         at com.sunopsis.dwg.cmd.g.y(g.java)
         at com.sunopsis.dwg.cmd.e.run(e.java)
         at java.lang.Thread.run(Unknown Source)
    com.hyperion.odi.essbase.ODIEssbaseException: com.hyperion.odi.essbase.ODIEssbaseException: Invalid value specified [RULES_FILE] for Load option [null]
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
         at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
         at com.sunopsis.dwg.cmd.e.i(e.java)
         at com.sunopsis.dwg.cmd.g.y(g.java)
         at com.sunopsis.dwg.cmd.e.run(e.java)
         at java.lang.Thread.run(Unknown Source)

    ODI Step: Prepare for Loading Step:
    from java.util import HashMap
    from java.lang import Boolean
    from java.lang import Integer
    from com.hyperion.odi.common import ODIConstants
    from com.hyperion.odi.connection import HypAppConnectionFactory
    # Target planning connection properties
    serverName = "HCDCD-HYPDB01"
    userName = "admin"
    password = "<@=snpRef.getInfo("DEST_PASS") @>"
    application = "BUDGET01"
    database = "PLAN1"
    portStr = "1423"
    srvportParts = serverName.split(':',2)
    srvStr = srvportParts[0]
    if(len(srvportParts) > 1):
        portStr = srvportParts[1]
    # Put the connection properites and initialize the essbase loader
    targetProps = HashMap()
    targetProps.put(ODIConstants.SERVER,srvStr)
    targetProps.put(ODIConstants.PORT,portStr)
    targetProps.put(ODIConstants.USER,userName)
    targetProps.put(ODIConstants.PASSWORD,password)
    targetProps.put(ODIConstants.APPLICATION_NAME,application)
    targetProps.put(ODIConstants.DATABASE_NAME,database)
    targetProps.put(ODIConstants.WRITER_TYPE,ODIConstants.DATA_WRITER)
    print "Initalizing the essbase wrapper and connecting"
    pWriter = HypAppConnectionFactory.getAppWriter(HypAppConnectionFactory.APP_ESSBASE, targetProps);
    tableName = "BUDGET01_PLAN1"
    rulesFile = r"ActLd"
    ruleSeparator = "Tab"
    clearDatabase = "None"
    calcScript = r""
    maxErrors = 1
    logErrors = 1
    errFileName = r"E:\OraHome_ODI\Error\Budget01Plan1.err"
    logEnabled = 1
    logFileName = r"E:\OraHome_ODI\Error\Budget01Plan1.log"
    errColDelimiter = r","
    errRowDelimiter = r"\r\n"
    errTextDelimiter = r"'"
    logHeader = 1
    commitInterval = 1000
    calcOnly = 0
    preMaxlScript = r""
    postMaxlScript = r""
    abortOnPreMaxlError = 1
    # set the load options
    loadOptions = HashMap()
    loadOptions.put(ODIConstants.CLEAR_DATABASE, clearDatabase)
    loadOptions.put(ODIConstants.CALCULATION_SCRIPT, calcScript)
    loadOptions.put(ODIConstants.RULES_FILE, rulesFile)
    loadOptions.put(ODIConstants.LOG_ENABLED, Boolean(logEnabled))
    loadOptions.put(ODIConstants.LOG_FILE_NAME, logFileName)
    loadOptions.put(ODIConstants.MAXIMUM_ERRORS_ALLOWED, Integer(maxErrors))
    loadOptions.put(ODIConstants.LOG_ERRORS, Boolean(logErrors))
    loadOptions.put(ODIConstants.ERROR_LOG_FILENAME, errFileName)
    loadOptions.put(ODIConstants.RULE_SEPARATOR, ruleSeparator)
    loadOptions.put(ODIConstants.ERR_COL_DELIMITER, errColDelimiter)
    loadOptions.put(ODIConstants.ERR_ROW_DELIMITER, errRowDelimiter)
    loadOptions.put(ODIConstants.ERR_TEXT_DELIMITER, errTextDelimiter)
    loadOptions.put(ODIConstants.ERR_LOG_HEADER_ROW, Boolean(logHeader))
    loadOptions.put(ODIConstants.COMMIT_INTERVAL, Integer(commitInterval))
    loadOptions.put(ODIConstants.RUN_CALC_SCRIPT_ONLY,Boolean(calcOnly))
    loadOptions.put(ODIConstants.PRE_LOAD_MAXL_SCRIPT,preMaxlScript)
    loadOptions.put(ODIConstants.POST_LOAD_MAXL_SCRIPT,postMaxlScript)
    loadOptions.put(ODIConstants.ABORT_ON_PRE_MAXL_ERROR,Boolean(abortOnPreMaxlError))
    #call begin load
    pWriter.beginLoad(loadOptions)
    Execution step from Operator:
    Read rows: 0
    Insert/Delete/Update rows: 0
    ODI Step: Load Data Into Essbase
    from com.hyperion.odi.common import ODIConstants
    from com.hyperion.odi.connection import HypAppConnectionFactory
    from java.lang import Class
    from java.lang import Boolean
    from java.sql import *
    from java.util import HashMap
    # Get the select statement on the staging area:
    sql= """select C1_ACCOUNT "Account",C3_TIMEPERIOD "TimePeriod",C4_LOBS "LOBs",C5_TREATY "Treaty",C6_SCENARIO "Scenario",C7_VERSION "Version",C8_CURRENCY "Currency",C9_YEAR "Year",C10_DEPARTMENT "Department",C11_ENTITY "Entity",C2_DIVLOC "DivLoc",C12_DATA "Data" from OdiMapping.dbo.C$_0BUDGET01_PLAN1Data where      (1=1) """
    srcCx = odiRef.getJDBCConnection("SRC")
    stmt = srcCx.createStatement()
    srcFetchSize=30
    stmt.setFetchSize(srcFetchSize)
    rs = stmt.executeQuery(sql)
    #load the data
    stats = pWriter.loadData(rs)
    #close the database result set, connection
    rs.close()
    stmt.close()
    ODI Step: Report Statistics
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (innermost last):
    File "<string>", line 2, in ?
    Essbase Writer Load Summary:
         Number of rows successfully processed: 1
         Number of rows rejected: 0
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
         at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
         at com.sunopsis.dwg.cmd.e.i(e.java)
         at com.sunopsis.dwg.cmd.h.y(h.java)
         at com.sunopsis.dwg.cmd.e.run(e.java)
         at java.lang.Thread.run(Unknown Source)
    Where am I going wrong with the data load into Essbase? How can I find the correct information to load data into Essbase?

  • Data pulled from RFC

    Hi,
    I want to pull data from R/3 into XI using the RFC adapter module. What are the different ways I can achieve this, and what are the pros and cons of each?
    And if I want to push the data from R/3 to XI, in what ways can I achieve that?
    Ravichandran K

    Ravi,
    If you want to send data from the SAP side, you have to invoke the RFC yourself, roughly like this (the function module and destination names are placeholders):
    CALL FUNCTION 'Z_YOUR_RFC'          "placeholder RFC-enabled function module
      IN BACKGROUND TASK
      DESTINATION 'XI_RFC_DEST'.        "placeholder RFC destination pointing to XI
    COMMIT WORK.
    Then all the values that you passed in your RFC call will go to XI.
    have you seen this weblog:
    /people/arpit.seth/blog/2005/06/27/rfc-scenario-using-bpm--starter-kit
    You may also have a look at the weblog that Michal wrote:
    /people/michal.krawczyk2/blog/2005/03/29/configuring-the-sender-rfc-adapter--step-by-step
    And also this,
    /people/arpit.seth/blog/2005/06/27/rfc-scenario-using-bpm--starter-kit
    Regards
    Sreeram.G.Reddy

  • Data pull from Access

    Dear All,
    I am a BI developer. I have to get some data from a Microsoft Access database into a SAP BI 7.0 cube. How do I proceed, beginning with registering the source system (Microsoft Access) in the BI system?
    Regards,
    Ratish
    [email protected]

    Hi
    Refer to this:
    Using MS Access as a Data Source for BW
    Regards
    N Ganesh

  • Is it possible? Change data capture from VSAM, DB2, SQL Server to Oracle

    Dear Professionals,
    We plan to build a data warehouse project. The source systems are
    * VSAM files in zOS
    * SQL Server
    * DB2
    Warehouse database is Oracle .
    Every night the changes in the source systems will be applied to the Oracle warehouse DB.
    But only the changes will be applied. Exporting the VSAM files to flat files, loading them into Oracle, and finding the data differences in Oracle is not acceptable; nor is moving the entire tables to Oracle and finding the changes there. Only the changes should pass through the network.
    Are "Oracle Connect" and the "VSAM adapter" capable of this?
    Is there a solution for SQL Server and DB2 change data capture?
    Is it possible? If it is possible, is it a headache or a simple install-and-forget process?
    Thank you
    Bunyamin..

    Bunyamin,
    I do not know about VSAM, but I have heard/read that Oracle Data Integrator is able to do change data capture on several databases. It uses the source database's own mechanisms to do CDC. So maybe give it a try, and ask at the Fusion Middleware forum where ODI is discussed.

  • RFID Tag data retrieval from BEA RFID edge server.

    Hello All,
    I am working on an RFID application.
    When I try to get the tag data from the reader, by default the tag data comes in the following format:
    <rawHex>urn:epc:raw:96.x964002000000000000000001</rawHex>
    which has urn:epc:raw:96. prepended to it, but we are interested only in the EPC number, i.e. 964002000000000000000001.
    Can someone please tell me how we can achieve this?
    I am executing the ImmediateSample.java file.
    Any help would be highly appreciated.
    Thanks,

    I have the same issue.
    It is very urgent for me to resolve.
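    No resolution was posted in this thread, but the string handling itself is simple. A hedged sketch in Python (rather than the Java of ImmediateSample.java), assuming the tag URI always looks like the example above, i.e. a urn:epc:raw:96.x prefix followed by the hex EPC:
        def extract_epc(raw_hex_uri):
            """Return only the hex EPC from a tag URI such as
            'urn:epc:raw:96.x964002000000000000000001'."""
            # Everything after the last '.' is the value part; strip the leading
            # 'x' radix marker if it is present.
            value = raw_hex_uri.rsplit(".", 1)[-1]
            return value[1:] if value.startswith("x") else value

        print(extract_epc("urn:epc:raw:96.x964002000000000000000001"))
        # -> 964002000000000000000001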

  • Max data pull from Virtual Cube - is this a setting?

    We have a user running a query against a remote cube in our BW system, and they are hitting a "maximum data" limit on the data returned from this remote cube. Is this a setting for this cube or a global setting, and can it be modified?
    Thanks,
    Ken Little
    RJ Reynolds Tobacco

    Hi,
    MAXSIZE = maximum size of an individual data packet in KB.
    The individual records are sent in packets of varying sizes during the data transfer to the Business Information Warehouse. With these parameters you determine the maximum size of such a packet, and therefore how much main memory may be used for building the data packet. SAP recommends a data packet size between 10 and 50 MB.
    https://www.sdn.sap.com/irj/sdn/directforumsearch?threadid=&q=cube+size&objid=c4&daterange=all&numresults=15
    MAXLINES = upper limit for the number of records per data packet.
    The default setting is 'Max. lines' = 100,000.
    The maximum main memory requirement per data packet is around:
    memory requirement = 2 * 'Max. lines' * 1000 bytes,
    i.e. 200 MB with the default setting.
    The formula for calculating the number of records in a data packet is:
    packet size = MAXSIZE * 1000 / transfer structure size (ABAP length),
    but not more than MAXLINES.
    E.g. if MAXLINES is smaller than the result of the formula, then MAXLINES records are transferred into BW.
    The size of the data packet is therefore the lower of MAXSIZE * 1000 / transfer structure size (ABAP length) and MAXLINES.
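    A small worked example of that formula in plain Python, with made-up numbers just to show how MAXSIZE, the transfer structure length and MAXLINES interact:
        # packet size = MAXSIZE * 1000 / transfer structure size, capped at MAXLINES
        def records_per_packet(maxsize_kb, struct_len_bytes, maxlines):
            return min(maxsize_kb * 1000 // struct_len_bytes, maxlines)

        # Example: MAXSIZE = 20000 KB, transfer structure 500 bytes, MAXLINES = 100000
        print(records_per_packet(20000, 500, 100000))   # 40000 records per packet

        # Rough memory estimate per packet with the default MAXLINES of 100000:
        print(2 * 100000 * 1000 / 1024 / 1024, "MB")    # about 190 MB, the ~200 MB quoted above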
    Go to transaction RSCUSTV6 and set it there.
    Alternatively, go to your InfoPackage and, from the menu, choose Scheduler -> Data Packet Settings; here you can specify your data packet size.
    Or go to R/3, transaction SBIW -> General Settings -> Maintain Control Parameters for Data Transfer.
    Here you can set the maximum values, but they can only be reduced on the BW side:
    InfoPackage -> Scheduler -> DataS. Default Data Transfer -> here you can give the size; you can reduce the size given on the R/3 side, but you cannot increase it here.
    In RSCUSTV6 you can set the package size; press F1 on it for more info, and take a look at OSS Notes 409641 'Examples of packet size dependency on ROIDOCPRMS' and 417307 'Extractor Package Size Collective Note'.
    Also Check SAP Note 919694.
    This applies irrespective of the source system, meaning it is applicable for all DataSources:
    Go to SBIW -> General Settings -> Maintain Control Parameters for Data Transfer -> enter the entries in the table.
    If you want to change it at the DataSource level:
    InfoSource -> InfoPackage -> Scheduler menu -> DataS. Default Data Transfer, and change the values there.
    Before changing the values, keep the SAP recommended parameters in mind.
    Hope this helps.
    Best Regards,
    VVenkat..
