Selection for data load

Hi friends,
I have a DataSource that contains 25 fields; of those 25 fields, we have selected only 4 as key fields. This DataSource is used in an ODS and everything works fine, so it was moved to production.
Now, because of the data volume in production, we want to load the data with a selection on a date field that is not a key field of the DataSource. We need to run the init on the BW side. How can we get data for a certain period only? Please guide me.
with hopes,
Jaya

Hi Jaya,
I assume you have data in your source,
and that you have a date field available as a selection in your InfoPackage.
If you have a date field (fiscal year/period, calendar month, and so on) in the selection tab of the InfoPackage, you can use it for the load into your DSO. This way you can do multiple loads into your DSO, and you also make sure there is less load on the source.
Run these selective loads as repair full requests into your DSO. Once that is done, perform an init without data transfer,
and then proceed with the regular deltas later on.
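For illustration, here is a minimal sketch of such an InfoPackage data-selection routine (a type-6 ABAP selection); the field name BLDAT and the date range are placeholders for whatever date field your DataSource offers in the selection tab:

* Sketch only - 'BLDAT' and the period are placeholders for your own field and dates.
DATA: l_idx LIKE sy-tabix.

READ TABLE l_t_range WITH KEY fieldname = 'BLDAT'.
l_idx = sy-tabix.

l_t_range-sign   = 'I'.
l_t_range-option = 'BT'.
l_t_range-low    = '20070101'.   "start of the period (internal format YYYYMMDD)
l_t_range-high   = '20070331'.   "end of the period
MODIFY l_t_range INDEX l_idx.

p_subrc = 0.

Each repair full request then gets its own low/high values, so the source only has to deliver one slice at a time.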
Thanks.
Assign points if this helps.

Similar Messages

  • Sample SOAP request for Data Loader API

    Hi
    Can anyone please help me with a sample SOAP request for the Data Loader API? That is, to import 1K records from my system into the CRM instance I have.

    Log into the application and then click on Training and Support; there is a WS Library of Information within the application.

  • Missing Standard Dimension Column for data load (MSSQL to Essbase Data)

    This is a similar error to one posted by Sravan -- however, I'm sure I have all dimensions covered -- going from MS SQL to SunOpsys Staging to Essbase. It is telling me a standard dimension is missing, but I have them all accounted for:
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (innermost last):
    File "<string>", line 23, in ?
    com.hyperion.odi.essbase.ODIEssbaseException: Missing standard dimension column for data load
    at com.hyperion.odi.essbase.ODIEssbaseDataWriter.loadData(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    I'm using multiple time period inputs -- BegBalance, Jul, Aug, Sep, Oct, Nov, Dec, Jan, Feb, Mar, Apr, May, Jun (the target has all of those in place of Time Periods).
    I'm using hard-coded input mappings for Metric, Scenario, Version, HSP_Rates and Currencies -> 'Amount', 'Actual', 'Final', 'HSP_InputValue', 'Local' respectively.
    The only thing I can think of is that since I'm loading to each of the months in the Time Periods dimension (the reversal was set up to accommodate that), it's somehow still looking for that dimension? Time Periods as a dimension does not show up in the reversal -- only the individual months named above.
    Any ideas on this one?

    John -- I extracted the data to a file and created a data load rule in Essbase to load the data. All dimensions are present and accounted for (five header items, similar to here) and everything loads fine.
    So I'm not sure what else is wrong -- I'm still getting the missing dimension error.
    Any other thoughts? Here's the entire error message. Thanks for all your help on this.
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (innermost last):
    File "<string>", line 23, in ?
    com.hyperion.odi.essbase.ODIEssbaseException: Missing standard dimension column for data load
         at com.hyperion.odi.essbase.ODIEssbaseDataWriter.loadData(Unknown Source)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
         at java.lang.reflect.Method.invoke(Unknown Source)
         at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java)
         at org.python.core.PyMethod.__call__(PyMethod.java)
         at org.python.core.PyObject.__call__(PyObject.java)
         at org.python.core.PyInstance.invoke(PyInstance.java)
         at org.python.pycode._pyx8.f$0(<string>:23)
         at org.python.pycode._pyx8.call_function(<string>)
         at org.python.core.PyTableCode.call(PyTableCode.java)
         at org.python.core.PyCode.call(PyCode.java)
         at org.python.core.Py.runCode(Py.java)
         at org.python.core.Py.exec(Py.java)
         at org.python.util.PythonInterpreter.exec(PythonInterpreter.java)
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)
         at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
         at com.sunopsis.dwg.cmd.e.k(e.java)
         at com.sunopsis.dwg.cmd.g.A(g.java)
         at com.sunopsis.dwg.cmd.e.run(e.java)
         at java.lang.Thread.run(Unknown Source)
    Caused by: com.hyperion.odi.essbase.ODIEssbaseException: Missing standard dimension column for data load
         at com.hyperion.odi.essbase.ODIEssbaseDataWriter.validateColumns(Unknown Source)
         ... 32 more
    com.hyperion.odi.essbase.ODIEssbaseException: com.hyperion.odi.essbase.ODIEssbaseException: Missing standard dimension column for data load
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
         at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
         at com.sunopsis.dwg.cmd.e.k(e.java)
         at com.sunopsis.dwg.cmd.g.A(g.java)
         at com.sunopsis.dwg.cmd.e.run(e.java)
         at java.lang.Thread.run(Unknown Source)

  • Extraction problem - selection conditions for data load using abap program

    Hi All,
    I have a problem loading data over a selected period where the date range selection is done using an ABAP routine (type 6). Although I can see the selection date range populated correctly in the request header tab of the monitor screen, no records are being extracted. But if I delete the ABAP filter and directly enter the same date range as the selection, we are able to extract data. If anybody has faced a similar problem and has a solution, please help me with your suggestions.
    Thanks,
    nithin.

    It seems the date range is not properly set in the routine.
    You can check the selection period generated by the routine in the Data Selection tab; there is an Execute button there.
    Click it to test the selection values generated by the ABAP routine.
    If the values look correct, then paste the code of the routine you have written, with brief details of the logic you applied.
    Sonal.....

  • Takes Long time for Data Loading.

    Hi All,
    Good morning. I am new to SDN.
    I am currently using the DataSource 0CRM_SRV_PROCESS_H, which contains 225 fields. I am using around 40 of those fields in my report.
    Can I hide the remaining fields at the DataSource level itself (transaction RSA6)?
    Currently the data load from the PSA to the ODS (ODS 1) takes a long time.
    I am also doing a lookup on another ODS (ODS 2), and it takes a long time to update the data in the active data table of the ODS.
    Can you please suggest how to improve the data load performance in this case?
    Thanks & Regards,
    Siva.

    Hi,
    Yes, you can hide them; just check the Hide box for those fields. Are you on BI 7.0 or BW 3.x? Either way: is the number of records huge?
    If so, you can split the records and execute the load in pieces; I mean, use the same InfoPackage, but execute it with different selections.
    Check in ST04 whether there are any locks or lock waits. If so, go to SM37 and check whether any long-running job is there and whether that job is progressing. Double-click on the job, copy the PID from the job details, go to ST04, expand the node, and check whether you can find that PID there.
    Also check the system log in SM21 and the short dumps in ST22.
    To improve performance, you can try to increase the virtual memory or the number of servers, if possible; this increases the number of work processes, since if many jobs run at the same time there may be no free work processes left.
    Regards,
    Debjani

  • Underlying RS Table :: Date & Time for Data Load Requests

    Dear SAP BW Community,
    In BW 3.5, does anybody know the underlying "RS" table where I can check the earliest date and time at which a data target was loaded, by providing the data target's technical name in SE16?
    Thanks!

    OK, I've found the timestamps of the data load requests in the table RSMONICDP.
    To get the earliest data load for InfoCube FRED, I'm going to Oracle via SQL*Plus, as follows:
    select to_char(min(TIMESTAMP)) from sapr3.RSMONICDP where ICUBE = 'FRED';

  • Increase the number of background work processes for data load performance

    Hi all,
    There are 10 available background work processes in the BW system. We're doing a mass load to multiple ODS objects, but the system uses only 3 background processes. How can I increase the number of background work processes used for the data load?
    I tried to change the number of processes with RSODSO_SETTINGS, but with no success. Are there any other settings that need to be changed?
    thanks,
    Yigit

    Hi Sankar,
    I entered the maximum number of processes into ROIDOCPRMS, but it doesn't make a difference; the system still uses only 3 background processes. RSCUSTA2 has been replaced by RSODSO_SETTINGS in BI 7.0, and that transaction can only change the processes for data activation, SID generation and rollback. I need to change the number of processes for data extraction.

  • Where Clause in Table Lookups for Data Load

    Hello,
    In Shared Components I created a Data Load Table. In this Data Load Table I added a Table Lookup. On the page for editing the Table Lookup there is a field called Where Clause. I tried to add a Where Clause to my Table Lookup in this field, but it seems to have no effect on the Data Load process.
    Does someone know how to use this Where Clause field?
    Thanks,
    Seb

    Hi,
    I'm having the same problem with the where clause being ignored in the table lookup. Is this a bug, and if so, is there a workaround?
    Thanks in advance

  • Add Data Source for Data Load Rule

    I created an ODBC system data source in both the 32-bit and 64-bit ODBC managers (just to be sure), but they're not showing up as options for data sources in the SQL interface in EAS. Is there some step that I'm missing?
    Thanks.

    Have you created the ODBC data source on the Essbase server or on the server hosting EAS? It should be on the Essbase server.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Suggest good strategy for data load through standard datasources

    Hi BW Gurus,
    We are currently using standard purchasing-related DataSources. We foresee new reports coming later that are based on these standard DataSources.
    Can you please suggest a good general strategy to follow for bringing in the R/3 data? Our concerns are around the data loads (initializations, etc.), as some of the standard DataSources are already in production.
    Please advise.

    Hi
    Go through these weblogs from Roberto Negro; they may help you.
    /people/sap.user72/blog/2004/12/16/logistic-cockpit-delta-mechanism--episode-one-v3-update-the-145serializer146
    /people/sap.user72/blog/2004/12/23/logistic-cockpit-delta-mechanism--episode-two-v3-update-when-some-problems-can-occur
    /people/sap.user72/blog/2005/01/19/logistic-cockpit-delta-mechanism--episode-three-the-new-update-methods
    Regards,
    Rajesh.

  • Last day of previous month for data load

    Hi,
    I have to load data from the previous month into the PSA and then into an InfoCube. I was wondering how to get the last day of the previous month, so that I can write ABAP code for it. I will be writing the code at the InfoPackage level, in the data selection. I could load data from the 1st of the previous month to the 1st of the current month, but that would be an additional load of 30,000 records on the 1st of every month (I load about 30,000 records every day), so I would like to limit the load to the range from the 1st of the previous month to the last day of that month. This will be a repetitive load.
    DATA: CURR_MM(2) TYPE N,
    CURR_YYYY(4) TYPE N,
    CURR_DD(2) TYPE N,
    PREV_MM(2) TYPE N,
    PREV_YYYY(4) TYPE N,
    PREV_DD(2) TYPE N,
    YYYY_MM(6),
    YYYY_MM1(6),
    DATE LIKE SY-DATUM.
    DATE = SY-DATUM.
    CURR_YYYY = DATE+0(4).
    CURR_MM = DATE+4(2).
    CURR_DD = DATE+6(2).
    PREV_DD = 1.
    IF CURR_MM = '01'.
    PREV_MM = '12'.
    PREV_YYYY = CURR_YYYY - 1.
    ELSE.
    PREV_MM = CURR_MM - 1.
    PREV_YYYY = CURR_YYYY.
    ENDIF.
    concatenate PREV_YYYY PREV_MM PREV_DD into YYYY_MM.
    concatenate CURR_YYYY PREV_MM PREV_DD into YYYY_MM1.
    read table l_t_range with key
    fieldname = 'BLDAT'.
    l_idx = sy-tabix.
    l_t_range-low = YYYY_MM.
    l_t_range-high = YYYY_MM1.
    l_t_range-sign = 'I'.
    l_t_range-option = 'BT'.
    modify l_t_range index l_idx.
    p_subrc = 0.
    Mind you, this code will load data from the 1st of the previous month to the 1st of the current month. I just don't want to load that extra "1st day" of current-month data, since I have 30,000 records every day.
    Say, for example, I want to load data from 1st Mar to 31st Mar, or from 1st Feb to 28th Feb. How should I modify the above code?
    Is there a formula to get the last date of the previous month? That's all I need; it would solve the problem.

    Try this routine. It returns a range from the current date (sy-datum) to the last day of the current month.
    DATA: l_s_range TYPE rsr_s_rangesid,
              E_T_RANGE TYPE  RSR_T_RANGESID.
    DATA: year(4) TYPE n,
          month(2) TYPE n,
          day(2) TYPE n,
        ld_keydate  TYPE sydatum,
          ld_lastday  TYPE sydatum.
      REFRESH e_t_range.
      CLEAR l_s_range.
      year  = sy-datum(4).
      month = sy-datum+4(2).
    *Months with 31 days in year
      IF month = '01' OR
         month = '03' OR
         month = '05' OR
         month = '07' OR
         month = '08' OR
         month = '10' OR
         month ='12'.
        day = '31'.
      ENDIF.
    *check for leap year: provoking sy-subrc <> 0
      IF month = '02'.
        day = '29'.
        MOVE:   '02'        TO ld_keydate+4(2),
                year        TO ld_keydate(4),
                day         TO ld_keydate+6(2).
        CALL FUNCTION 'DATE_CHECK_PLAUSIBILITY'
          EXPORTING
            date   = ld_keydate
          EXCEPTIONS
            OTHERS = 1.
        IF sy-subrc <> 0.
          day = '28'.
        ENDIF.
      ENDIF.
    *months with 30 days in year
      IF month = '04' OR
         month = '06' OR
         month = '09' OR
         month = '11'.
        day = '30'.
      ENDIF.
      MOVE: year              TO ld_lastday(4),
            month             TO ld_lastday+4(2),
            day               TO ld_lastday+6(2).
      l_s_range-low  = sy-datum.
      l_s_range-high = ld_lastday.
      l_s_range-sign = 'I'.
      l_s_range-opt  = 'BT'.
      APPEND l_s_range TO e_t_range.
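    If all you need is the last day of the previous month, a shorter approach is possible. The following is only a sketch (variable names are illustrative) that fits the l_t_range logic from the question above; it relies on standard ABAP date arithmetic:
    DATA: lv_day(2)     TYPE n,
          lv_last_prev  LIKE sy-datum,
          lv_first_prev LIKE sy-datum.
    lv_day = sy-datum+6(2).            "current day of the month
    lv_last_prev = sy-datum - lv_day.  "last day of the previous month
    lv_first_prev = lv_last_prev.
    lv_first_prev+6(2) = '01'.         "first day of the previous month
    l_t_range-low  = lv_first_prev.
    l_t_range-high = lv_last_prev.
    Run on any day of the current month, this always yields the complete previous month (for example 1 Mar to 31 Mar), so the extra "1st day" of the current month is not loaded.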

  • Selection in data load from infoprovider

    Hi Guys,
    In BPC NW 7.5 we have to load data from an InfoProvider while allowing users to select the data (for example, on the TIME dimension) via a BPC prompt. We found two solutions that partially solve our problem; in both cases, users have to modify the selections manually.
    SOLUTION 1:
    To load the data we want to use the process chain /CPMB/INFOPROVIDER, but we know that it is not possible to insert selections in the /CPMB/INFOPROVIDER prompt (as described in the post Re: Package LOAD INFOPROVIDER, Select input ENTITY).
    To select the data we can use an intermediate InfoCube as a BW workaround (as described in the post Re: BPC 7.5: Delta Load when loading from BI InfoProvider) so that we have a source containing only the selected data. This could be done with a selection in the DTP between the source InfoCube and the intermediate InfoCube. This solution is not dynamic; in this case users have to modify the DTP selection manually.
    How can we allow users to enter this selection for the DTP via a BPC prompt?
    SOLUTION 2:
    To select the data we can use a transformation file containing a selection such as
    SELECTION = <Dimension1_techname>,<Dimension1_value>.
    This is not dynamic either; in this case as well, users have to modify the file selection manually.
    Do you know how to allow these selections via a BPC prompt, to avoid these manual changes?
    Do you know of other solutions?
    Thank you for your support.

    Hi D-Mark,
    This is definitely a place where it would be nice to see some additional functionality added to BPC. Variable replacement in the transformation file based on the data manager prompt would probably be the best thing to have in the software.
    In any case, getting back to your question, manually modifying the transformation file selection is the most common practice on BPC projects. The blog linked by Naresh is a fairly elegant way to do this, though it doesn't completely get around the fact that it's easy to forget to do and easy to get confused about what is going on in the transformation file.
    A third option that no one has mentioned is to do a SELECTION statement in the transformation file based on navigational attributes in the source InfoProvider. This approach can make the selection statement dynamic based on the contents of BW InfoObjects. Still not very user-friendly, but if you can put an automatic process in place to update the BW navigational attributes this might meet your need without having to set up an extra BW staging InfoProvider.
    The SELECTION syntax is documented here, though it doesn't mention that you can select on navigational attributes: [http://help.sap.com/saphelp_bpc75_nw/helpdata/en/5d/9a3fba600e4de29e2d165644d67bd1/frameset.htm]
    With navigational attributes (the profit center attribute of cost center, for example) it would be something like:
    SELECTION=0COST_CENTER___0PROFIT_CENTER,PC01
    Ethan

  • Essbase 9.3.1, more time taken for data load.

    Hi,
    I am trying to load 15 GB of data (taken from two separate Oracle databases) directly into my ASO application.
    The load is very slow; what factors should I consider to make the load faster?
    Will increasing the RAM help in this context?
    I have gone through the Essbase 9.3.1 admin documentation; the hard disk and RAM requirements given there are for block storage.
    Is there any difference between that estimation and the estimation for ASO?
    If anyone can guide me on what has to be taken care of when loading the cube with a huge amount of data, I shall be very thankful.

    The statements that matter for aggregate storage are:
    DLSINGLETHREADPERSTAGE FALSE
    DLTHREADSPREPARE Sample Basic 3
    The write option (DLTHREADSWRITE Sample Basic 4) doesn't have any impact on the write speed of the data in ASO.
    The statements have to be included in the essbase.cfg file, which is the configuration file of the Essbase server.
    As per the document, should this be done through MaxL, ESSCMD or the Analytic Services console?
    Questions:
    1) Should we simply put these statements in the essbase.cfg file without a semicolon, and then restart the server/application?
    2) Can these statements be executed via MaxL? If so, please let us know how.
    3) How do we know the current number of threads being used for the read/write?
    Thanks

  • Multiple Images - selection for pre loading

    The preload image behaviour supplied within Dreamweaver is fine up to a point, but it inserts javascript files and is a pain to use if, for instance, you have 20 or so images to load. Or maybe I'm just lazy!
    It is possible to preload images using CSS2 (see http://www.thecssninja.com/css/even-better-image-preloading-with-css2 ) and in CSS3 (see http://perishablepress.com/press/2010/01/04/preload-images-css3/ )
    But none of these gives you the option to browse to a folder, select a group of images from it, and paste the results into the code.
    Has anyone found an extension that will do this?

    >> What seems to be happening the first time is the image isn't loaded before the fade is finished, so the image just appears after a short pause the first time.
    As has been said many times in this forum - use the MovieClipLoader class to load images, and then use that class' onLoadInit method to do things with the loaded image. Loading is asynchronous - you _have_ to wait until the image is done loading before doing a fade.
    Dave -
    www.offroadfire.com
    Head Developer
    http://www.blurredistinction.com
    Adobe Community Expert
    http://www.adobe.com/communities/experts/

  • Can I use Office 2007 csv file for data load?

    Hi all,
    I see some special characters when looking at the data after the load.
    I am using the latest version of Microsoft Office.
    Does SAP BW support CSV files saved in the 2007 version?
    Thanks and Regards,
    Ravi.

    Hi Ravi,
    This is another error.
    Error message from the source system
    Diagnosis
    An error occurred in the source system.
    System response
    Caller 09 contains an error message.
    Further analysis:
    The error occurred in Extractor .
    Refer to the error message.
    Procedure
    How you remove the error depends on the error message.
    Note
    If the source system is a Client Workstation, then it is possible that the file that you wanted to load was being edited at the time of the data request. Make sure that the file is in the specified directory, that it is not being processed at the moment, and restart the request.
    Error Message:
    Error 1 when loading external data
    Error when opening the data file
    Errors in source system.
    Step-by-step analysis:
    Data Request sent off?
    Data Selection successfully started?
    Data Selection successfully finished?
    Processing error in source system reported?
    Processing error in warehouse reported?
    Could you please guide me on how to overcome these errors. Thanks.
