Load Material Data using 0MATERIAL_ATTR

Hello,
I have replicated the DataSource 0MATERIAL_ATTR from R/3 to my BI system.
When I try to create transfer rules using the InfoSource 0MATERIAL_ATTR, it shows the error message "InfoObject 0MATERIAL_ATTR is not available in version D".
I observed that the InfoSource 0MATERIAL_ATTR is not a 3.x InfoSource.
Is this problem because the DataSource is 3.x and the InfoSource is not?
How can I fix this? Any help, please.

Hi,
Try to find your DataSource in transaction RSA1 -> DataSources, and delete it.
Then replicate it again; when asked how you want to replicate it, choose the BW 3.x version.
Then you can map it normally in the transfer rules.
Diogo.

Similar Messages

  • TSV_TNEW_PAGE_ALLOC_FAILED error while loading the DATA using DTP

    Hi,
    While loading data using DTPs for two DSOs we are getting the error
    TSV_TNEW_PAGE_ALLOC_FAILED.
    Can anyone kindly help me out regarding this?
    Thank You,
    Poornima.

    Hi Soundarya,
    Thanks a lot for the reply. But I found that it runs fine in development, whereas in quality it throws an error. This happened for two DSOs. In both transformations I have identified that the transformation names differ between development and quality.
    There are no routines written for them and no SELECT statements have been used.
    Can you please advise me on this?
    Edited by: Poornima Gayatri on Mar 22, 2010 7:00 AM

  • Loading FA data using ORACLE WEB ADI brings application down.

    Hi,
    Can anyone advise on the issue below?
    Loading FA data using Oracle Web ADI brings the application down.
    Regards
    Ketan.

    Application Release 12.0.6
    Database version 10.2.0.2
    The following error has occurred
    Exception Name: java.lang.NullPointerException
    Stack Trace:
    java.lang.NullPointerException
        at oracle.apps.fnd.common.WebAppsContext.doValidateSession(WebAppsContext.java:1425)
        at oracle.apps.fnd.common.WebAppsContext.validateSession(WebAppsContext.java:1775)
        at oracle.apps.bne.framework.BneOracleWebAppsContext.isValidateIcxSessionTicket(BneOracleWebAppsContext.java:2258)
        at oracle.apps.bne.framework.BneOracleWebAppsContext.setupContext(BneOracleWebAppsContext.java:1511)
        at oracle.apps.bne.framework.BneAbstractWebAppsContext.getContext(BneAbstractWebAppsContext.java:207)
        at oracle.apps.bne.framework.BneBaseBajaContext.getBneWebAppsContext(BneBaseBajaContext.java:191)
        at oracle.apps.bne.framework.BneBajaServlet.doRequest(BneBajaServlet.java:151)
        at oracle.apps.bne.framework.BneBaseServlet.doPost(BneBaseServlet.java:95)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:763)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:856)
        at com.evermind.server.http.ResourceFilterChain.doFilter(ResourceFilterChain.java:64)
        at oracle.apps.jtf.base.session.ReleaseResFilter.doFilter(ReleaseResFilter.java:26)
        at com.evermind.server.http.EvermindFilterChain.doFilter(EvermindFilterChain.java:15)
        at oracle.apps.fnd.security.AppsServletFilter.doFilter(AppsServletFilter.java:318)
        at com.evermind.server.http.ServletRequestDispatcher.invoke(ServletRequestDispatcher.java:627)
        at com.evermind.server.http.ServletRequestDispatcher.forwardInternal(ServletRequestDispatcher.java:376)
        at com.evermind.server.http.HttpRequestHandler.doProcessRequest(HttpRequestHandler.java:870)
        at com.evermind.server.http.HttpRequestHandler.processRequest(HttpRequestHandler.java:451)
        at com.evermind.server.http.AJPRequestHandler.run(AJPRequestHandler.java:299)
        at com.evermind.server.http.AJPRequestHandler.run(AJPRequestHandler.java:187)
        at oracle.oc4j.network.ServerSocketReadHandler$SafeRunnable.run(ServerSocketReadHandler.java:260)
        at com.evermind.util.ReleasableResourcePooledExecutor$MyWorker.run(ReleasableResourcePooledExecutor.java:303)
        at java.lang.Thread.run(Thread.java:595)
    Sometimes we are getting:
    Fatal error FND_MESSAGE : max no of open file descriptors.
    In the alert log file it shows:
    WARNING: Oracle instance running on a system with low open file descriptor
    limit. Tune your system to increase this limit to avoid
    severe performance degradation.
    I am not able to understand exactly where the issue is.
    Regards
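    The alert-log warning points at the operating system's open-file-descriptor limit rather than at Web ADI itself. As a quick diagnostic, here is a minimal Python sketch for checking the limit of a process on a Unix host (illustrative only; the real fix is raising the limit for the database owner in /etc/security/limits.conf or its equivalent and restarting the instance):

        import resource  # standard library, Unix only

        # Query the soft and hard limits on open file descriptors for this process.
        soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
        print(f"open file descriptors: soft={soft}, hard={hard}")

        # A process may raise its own soft limit up to the hard limit without root.
        resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))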

  • Load Material Data in the Material Master

    Hi All,
    We have the Material Ledger active at the material level. We are using ECC 6.0 with the highest enhancement pack.
    The requirement is to load the moving average prices for the company code currency and the group currency as part of the Material Master data migration.
    I am using the business object BUS1001006's SAVEDATA method (basic type MATMAS_BAPI03) in an LSMW to achieve this.
    This IDoc does not provide the structure or fields to load the Material Ledger data. I tried passing the price to MBEW-VERPR, but the price is not updated.
    One workaround is to use MR21, but that would be a separate program, which we want to avoid.
    Has anyone done this before? I will greatly appreciate any help or suggestions.
    Thank you for your time.
    Regards,
    Vamsee

    Yes, I am loading from the text file.
    My Zmaterial is defined as an InfoObject with language-dependent text data.
    Now I want to load this master data through a flat file.
    I created a FlatFile source system and then created a DataSource. Under the Extraction tab I selected the physical location of my flat file, with the CSV separator and so on. Then under Proposal I saw a preview of my data. Later, under the Fields tab, for Language I selected 0HTLANG.
    For Material ID I selected Zmaterial (defined by me).
    But now there is one more field remaining, Material Description; what template InfoObject should I put against that field? That's my confusion.
    After this I can click on Preview and then load that data into the PSA through an InfoPackage, and later into the InfoObject.
    Please inform.
    Thanks

  • Unable to load CSV data using APEX Data Load using Firefox/Safari on a MAC

    I have APEX installed on a Windows XP machine connected to an 11g database on the same Windows XP machine.
    While on the Windows XP machine, using IE 7, I am able to successfully load a CSV spreadsheet of data using the APEX Data Load utility.
    However, if I switch to my MacBook Pro running OS X Leopard, log into the same APEX machine using Firefox 2 or 3 or Safari 3, and then try to upload CSV data, it fails on the "Table Properties" step. When it asks for the name of the new table and for the table properties, the table properties never appear (they do appear in IE 7 on Windows XP), and if you try to hit the NEXT button you get the error message: "1 error has occurred. At least one column must be specified to include in new table." And of course you can't specify any of the columns, because there is nothing under SET TABLE PROPERTIES in the interface.
    I tried Firefox 2, Firefox 3 (beta), and Safari 3.1, and got the same failed result in all three. If I return to the Windows XP machine and use IE 7.0, Data Load works just fine. I work in an all-Mac environment; it was difficult to get a Windows machine into my workplace, and all my end users will be using Macs. There is no current version of IE for the Mac, so I have to use Firefox or Safari.
    Is there some option in Firefox or Safari that I can turn on so this Data Load feature will work on the MAC?
    Thanks for your help. Any assistance appreciated.
    Tony

    I managed to get this to work by saving the CSV file as Windows CSV (not DOS CSV), which allowed the CSV data to be read by Oracle running on Windows XP. I think the problem had to do with different character sets or line endings being used for CSV on the Mac versus CSV on Windows. Maybe if I had created my Windows XP Oracle database with Unicode as the default character set, I would never have experienced this problem.
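    For what it's worth, the Excel "CSV" flavors differ mainly in their line-ending convention (classic Mac saves a bare CR, Windows uses CRLF). If re-saving from Excel is not an option, here is a minimal Python sketch to normalize a Mac-style file (the file names are placeholders):

        # Normalize CR and bare LF line endings to CRLF so the upload parser
        # sees one row per line.
        with open("data_mac.csv", "rb") as f:
            raw = f.read()

        normalized = (raw.replace(b"\r\n", b"\n")
                         .replace(b"\r", b"\n")
                         .replace(b"\n", b"\r\n"))

        with open("data_windows.csv", "wb") as f:
            f.write(normalized)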

  • How to load the data using a plsql table in ODI.

    Hi All ,
    Can anyone help me with this?
    We have a PL/SQL procedure which returns a PL/SQL table as an OUT parameter.
    We are supposed to load the data into a file using this PL/SQL table (table type) in ODI.
    Can this be done using ODI?
    Regards,
    Karthik

    Hi,
    We have one process with a ref cursor (Oracle) as a source and a remote Oracle DB as a target. I ended up writing my own KM that first populates a global temporary table from the cursor and then transfers the data to the target. If a temporary table is an option for you, the rest is pretty easy.
    Regards.
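    Outside ODI, the cursor-to-file step itself is easy to prototype. Here is a minimal sketch using the cx_Oracle library, assuming the procedure can expose the data as a ref cursor (the connection details and procedure name are placeholders):

        import csv
        import cx_Oracle  # Oracle client library for Python

        conn = cx_Oracle.connect("user", "password", "dbhost/orcl")  # placeholders
        cur = conn.cursor()

        # Bind an OUT ref cursor and call the procedure that fills it.
        ref_cursor = cur.var(cx_Oracle.CURSOR)
        cur.callproc("my_pkg.get_data", [ref_cursor])  # hypothetical procedure

        # Dump the cursor's rows to a CSV file, header first.
        result = ref_cursor.getvalue()
        with open("output.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow([col[0] for col in result.description])
            writer.writerows(result)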

  • Deleting master data after loading transactional data using flat file

    Dear All,
    I have loaded transaction data into an InfoCube using a flat file. While loading, in the DTP I checked the option to load transaction data even if no master data exists, so the transaction data is loaded even when the master data is not in BW.
    While loading the flat file I made a mistake with the DIVISION characteristic: the original master data value is '04000', but I loaded the transaction data with the value '4000'. I realized this later after seeing the data in the InfoCube, deleted the request, and then reloaded the data with the value '04000'. Up to this point everything is fine.
    But when I look at the master data for DIVISION, I can see a new entry with the value '4000'.
    My question is how to delete the entry '4000' from DIVISION. I tried deleting this entry manually from master data maintenance, but it does not allow me to do so.
    I have also checked whether any transaction data exists for the value '4000'; as I said earlier, I deleted the transaction data with that value. I even tried to delete the entries from the master data table directly, but I do not see an option to delete entries there.
    Please advise me on this.
    Regards,
    Veera

    Hi,
    Go to RSA1, right-click on the InfoObject and select Delete Master Data. This deletes the unused master data existing in the table.
    If this master data is not used anywhere else, just delete the master data completely, with the SID option.
    If even this does not work, you can delete the entire table contents in SE14. But this wipes out the whole table, so be sure you really want to do that.
    Hope this helps,
    Akhan

  • Loading Master data using DTP's

    Hi,
    I have a problem with loading master data. We use DTPs to load it, but the requests fail because of prior incorrect requests for that master data InfoObject, despite the fact that master data always overwrites and the DTPs are full-update enabled.
    To load successfully we have to go and delete the previous requests every time.
    I think there must be some way of solving this problem; help from you all will be really appreciated.
    Regards,
    G.Monica Roja Flora.

    Yes, you need to delete the incorrect requests before you can load again. Although master data is overwritten, with an incorrect request present the status can never become A (active).

  • Load master data using input schedules

    Hi all,
    Is there a way to load and modify master data using input schedules?
    Regards,
    Nithya

    Hi Nithyapriya,
    Chiming in here to second Joost's statement above. As of BPC version 5.1, it is not possible to update master data from an input schedule.
    Whether that is possible in other products or with BPC 7.0 NW (currently in ramp-up) is another story. For now, the general answer is that it is not possible.
    This is a known area for improvement in BPC at this time, and we are sure to hear more in the future, whether through integration with NetWeaver, other methods of making things more dynamic, SSIS packages, etc.
    Edited by: garrett tedeman on Sep 2, 2008 7:20 PM

  • Loading transaction data using an attribute field in the BW cube

    I am trying to load transaction data into BPC. All but one of my dimensions reference technical fields; one of them references an attribute of a technical field. The technical field is 0MAT_PLANT and its attribute is 0PROFIT_CTR. In the mapping section of my transformation file I set the PROFIT_CTR dimension to 0MAT_PLANT__0PROFIT_CTR. The transformation file validates and processes fine, but when I try to import I get the error message: "0MAT_PLANT__0PROFIT_CTR is not a valid command or column 0MAT_PLANT__0PROFIT_CTR does not exist in source".
    Does anyone know how to complete this successfully?
    THANK YOU!
    Karen B. Thibodeau

    Hi,
    If it is mapped correctly and you are still getting the same errors, maybe you can check whether the InfoObject attribute option is checked in the data package options.
    Regards

  • Error while loading external data using Business Content

    Hi,
    I am new to BW and I'm stuck with this problem.
    I'm trying to load data from an external file using a Business Content cube (0PUR_C01) and InfoSources (2LIS_02_ITM/S012/SCL).
    I first installed the above-mentioned cube and DataSources.
    Then I activated the update rules and transfer rules without any changes. (I don't know what changes might be required!)
    The monitor shows a successful load, but the cube has 0 entries.
    Even the simulation of the update shows data in the data target.
    Can someone please guide me through this?
    It's really urgent.
    Thank you all in advance.

    Hi Sharmistha,
    Welcome to SDN!! Does your monitor show any records extracted as far as the PSA? Go to the Details tab -> Processing and click on each node to check what the monitor shows.
    Hope it helps.
    Thx,
    Soumya

  • Updating material data using a function module

    Hi,
    I want to update fields in the MVKE table, but the field is a customer field, not a standard SAP field. I want to update the field from a program, without going through MM02. Which function can I use for the update? There is a function module MATERIAL_MAINTAIN_DARK for updating; will this function module update the customer fields as well? Can anyone tell me about this function module and how to use it?

    Using the function module BAPI_MATERIAL_SAVEDATA, you can also update the custom fields.
    This is the documentation for the extension fields:
    Reference Structure for BAPI Parameters EXTENSIONIN/EXTENSIONINX
    Description
        You use this structure to transfer the material's customer-defined fields. For information on transferring these fields, see the function module documentation.
    Note
        Besides the table fields already defined, customer-defined table fields can also be supplied with data. Since these fields are created by the customer, they are known only at runtime and must therefore be determined dynamically.
        The structures BAPI_TE_<NAME> (<NAME> = MARA, MARC, MARD, MBEW, MLGN, MLGT, MVKE) and the relevant checkbox structures BAPI_TE_<NAME> (<NAME> = MARAX, etc.) must first be extended by the customer to include the fields required. The standard structures contain only the corresponding key fields. When including new fields in these structures, make sure that each field has the same name as the field in the database table. In addition, the fields in the structures BAPI_TE_<NAME> may only be of type CHARACTER. The data element BAPIUPDATE must be used for the fields in the checkbox structure (except for key fields).
        The two parameters EXTENSIONIN and EXTENSIONINX are used to transfer the data to the method. The field STRUCTURE contains the name of the structure (for example, BAPI_TE_MARA or BAPI_TE_MARAX) used to identify the work area (for example, WA_BAPI_TE_MARA or WA_BAPI_TE_MARAX) to which the data is transferred. The remaining fields of the parameters EXTENSIONIN and EXTENSIONINX contain the data for the key fields (for example, the material number) and the data for the customer-defined fields. The number of characters reserved in the two parameters for the content of a customer-defined field must be the same as the number of characters of the corresponding work area field. If fewer characters are required, the remaining characters in the two parameters must be filled with blanks; only then may the content of another field be transferred. Here too, remember that the data is written to the database only if the corresponding indicator has been set in the work area.
    Regards,
    ravi
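    To make the packing rules above concrete, here is a minimal sketch of filling EXTENSIONIN/EXTENSIONINX, written with the pyrfc library purely for illustration (inside SAP you would fill the same tables in ABAP or via LSMW). It assumes, hypothetically, that BAPI_TE_MVKE/BAPI_TE_MVKEX were extended with a 10-character custom field ZZCUST, and uses the MVKE key widths MATNR (18), VKORG (4), VTWEG (2):

        from pyrfc import Connection  # SAP NetWeaver RFC SDK bindings

        conn = Connection(ashost="sap-host", sysnr="00", client="100",
                          user="RFC_USER", passwd="secret")  # placeholder logon data

        # Key fields first, each padded to its work-area width, then the custom value.
        key = "MAT-001".ljust(18) + "1000".ljust(4) + "10".ljust(2)
        zzcust = "VALUE01".ljust(10)  # hypothetical 10-character custom field

        result = conn.call(
            "BAPI_MATERIAL_SAVEDATA",
            HEADDATA={"MATERIAL": "MAT-001", "SALES_VIEW": "X"},
            SALESDATA={"SALES_ORG": "1000", "DISTR_CHAN": "10"},
            SALESDATAX={"SALES_ORG": "1000", "DISTR_CHAN": "10"},
            EXTENSIONIN=[{"STRUCTURE": "BAPI_TE_MVKE",
                          "VALUEPART1": key + zzcust}],
            # Checkbox structure: same keys, then a one-character 'X' update flag
            # (the checkbox fields use data element BAPIUPDATE, CHAR 1).
            EXTENSIONINX=[{"STRUCTURE": "BAPI_TE_MVKEX",
                           "VALUEPART1": key + "X"}],
        )
        conn.call("BAPI_TRANSACTION_COMMIT", WAIT="X")
        print(result["RETURN"])  # BAPIRET2 message from the save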

  • Problems loading data using Data Movement

    Hello Sir,
    I have created a .dat file and want to load it into a table created by a user.
    Now, using Data Movement, when I give the username and password for the host credentials, this error pops up:
    install_driver(Oracle) failed: Can't load 'D:/app/Metamorph/product/11.1.0/db_1/perl/site/5.8.3/lib/MSWin32-x86-multi-thread/auto/DBD/Oracle/Oracle.dll' for module DBD::Oracle: load_file:The specified module could not be found at D:/app/Metamorph/product/11.1.0/db_1/perl/5.8.3/lib/MSWin32-x86-multi-thread/DynaLoader.pm line 229. at (eval 10) line 3 Compilation failed in require at (eval 10) line 3. Perhaps a required shared library or dll isn't installed where expected at D:/app/Metamorph/product/11.1.0/db_1/perl/site/5.8.3/lib/MSWin32-x86-multi-thread/Oraperl.pm line 58 Compilation failed in require at D:\app\Metamorph\product\11.1.0\db_1\sysman\admin\scripts/emd_common.pl line 37. BEGIN failed--compilation aborted at D:\app\Metamorph\product\11.1.0\db_1\sysman\admin\scripts/emd_common.pl line 37. Compilation failed in require at D:\app\Metamorph\product\11.1.0\db_1/sysman/admin/scripts/db/db_common.pl line 73. Compilation failed in require at - line 1.
    How do I proceed? Can I get some guidelines?
    Thanks and Regards,
    Binney

    Hmmm, I'm not sure I know what "Data Movement" is, but I searched Metalink and found a doc saying you should use the Oracle-shipped Perl binary instead of any other one (the Perl from the OS). So check that $ORACLE_HOME/perl/bin/perl is the one used by your scripts if you want to use the Oracle-supplied Perl modules and tools.
    BTW, why don't you use SQL*Loader instead of other tools to load data into the database?

  • Using sqlldr to load CLOB data from DB2

    I am stuck trying to resolve this problem. I am migrating data from DB2 to Oracle. I used DB2 export to extract the data, specifying the lobsinfile clause. This created all the CLOB data in one file, so a typical record has a column with a reference to the CLOB data, such as "OUTFILE.001.lob.0.2880/", where OUTFILE.001.lob is the name specified in the export command, 0 is the starting position in the file, and 2880 is the length of the first CLOB.
    When I try to load this data using sqlldr, I get a "file not found" error.
    The control file looks something like this:
    clob_1 FILLER char(100),
    "DETAILS" LOBFILE(clob_1) TERMINATED BY EOF,
    I'm using Oracle 11gR2 and DB2 9.7.5
    Your help is appreciated.

    OK, here are additional details. Some names have changed but the idea is the same. Also, sqlldr is executing in the same directory as the data files and the control file.
    The primary data file is VOIPCACHE.dat; the secondary data file (the file with the LOB data) is VOIPCACHE.001.lob.
    Control file:
    load data
    infile 'VOIPCACHE.dat'
    badfile 'VOIPCACHE.bad'
    discardfile 'VOIPCACHE.dsc'
    replace into table VOIPCACHE
    fields terminated by ',' optionally enclosed by '"' TRAILING NULLCOLS
    (KEY1 "rtrim(:KEY1)",
    FIELD8,
    clob_1 FILLER char (100),
    "DATA" LOBFILE(clob_1) TERMINATED BY EOF)
    Snippet from the log file:
    FIELD7 NEXT * , O(") CHARACTER
    FIELD8 NEXT * , O(") CHARACTER
    CLOB_1 NEXT 100 , O(") CHARACTER
    (FILLER FIELD)
    "DATA" DERIVED * EOF CHARACTER
    Dynamic LOBFILE. Filename in field CLOB_1
    SQL*Loader-502: unable to open data file 'VOIPCACHE.001.lob.0.0/' for field "DATA" table VOIPCACHE
    SQL*Loader-553: file not found
    SQL*Loader-509: System error: No such file or directory
    SQL*Loader-502: unable to open data file 'VOIPCACHE.001.lob.0.47/' for field "DATA" table VOIPCACHE
    SQL*Loader-553: file not found
    SQL*Loader-509: System error: No such file or directory
    SQL*Loader-502: unable to open data file 'VOIPCACHE.001.lob.47.47/' for field "DATA" table VOIPCACHE
    SQL*Loader-553: file not found
    SQL*Loader-509: System error: No such file or directory
    SQL*Loader-502: unable to open data file 'VOIPCACHE.001.lob.94.58/' for field "DATA" table VOIPCACHE
    SQL*Loader-553: file not found
    SQL*Loader-509: System error: No such file or directory
    SQL*Loader-502: unable to open data file 'VOIPCACHE.001.lob.152.58/' for field "DATA" table VOIPCACHE
    SQL*Loader-553: file not found
    SQL*Loader-509: System error: No such file or directory
    SQL*Loader-502: unable to open data file 'VOIPCACHE.001.lob.210.206/' for field "DATA" table VOIPCACHE
    This is repeated for each record
    sqlldr command:
    sqlldr userid=${SCHEMA}/${PASSWD}@$ORACLE_SID control=${CTLDIR}/${tbl}.ctl log=${LOGDIR}/${tbl}.log direct=true errors=50
    I don't think the variables are important here.
    -EC
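    The names sqlldr is complaining about (VOIPCACHE.001.lob.0.47/ and so on) are DB2 LOB Location Specifiers of the form file.offset.length/, which SQL*Loader takes to be literal file names, hence "file not found". One workaround is to pre-split the combined LOB file into one real file per LOB and rewrite the references before loading. A minimal Python sketch under those assumptions (it assumes the LOB reference is the last field of each record; the file names are taken from the post):

        import csv
        import re

        # DB2 LOB Location Specifier: <file>.<offset>.<length>/
        LLS = re.compile(r"^(?P<file>.+)\.(?P<off>\d+)\.(?P<len>\d+)/$")

        with open("VOIPCACHE.001.lob", "rb") as f:
            lob_blob = f.read()

        rows_out = []
        with open("VOIPCACHE.dat", newline="") as f:
            for i, row in enumerate(csv.reader(f)):
                m = LLS.match(row[-1])  # assumes the LOB reference is the last field
                if m:
                    off, length = int(m.group("off")), int(m.group("len"))
                    lob_name = f"lob_{i}.dat"  # one real file per LOB
                    with open(lob_name, "wb") as out:
                        out.write(lob_blob[off:off + length])
                    row[-1] = lob_name  # LOBFILE(clob_1) now points at a real file
                rows_out.append(row)

        with open("VOIPCACHE_fixed.dat", "w", newline="") as f:
            csv.writer(f).writerows(rows_out)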

  • Loading Data Using Outline Load Utility - Error

    Trying to load some data using the Outline Load utility. I followed the Oracle documentation on this, and under Administration -> Manage Data Load I defined Accounts as the data load dimension and Period as the driver dimension. I then added Jan - Feb as driver members. I wanted to load data for Jan. My data header looks like this:
    Accounts,Jan,Point-of-View,Data Load Cube Name
    acct1, 768, "2010, Ver1, Actuals, entity1, etc" Plan1
    The command I typed is this:
    C:\Oracle\Middleware\user_projects\epmsystem1\Planning\planning1>outlineload /A:Plan1 /U:Admin /M /I:C:\MockData2.txt /D:Accounts /L:C:\dataload.log /X:c:\dataload.exc
    I get the following errors:
    [Mon Nov 22 11:03:02 CST 2010] Unrecognized column header value "Jan".
    [Mon Nov 22 11:03:02 CST 2010] Unrecognized column header value "Point-of-View".
    [Mon Nov 22 11:03:02 CST 2010] Unrecognized column header value "Data Load Cube Name".
    [Mon Nov 22 11:03:02 CST 2010]Unable to obtain dimension information and/or perform a data load: Unrecognized column header value(s), refer to previous messages. (Note: column header values are case sensitive.)
    [Mon Nov 22 11:03:02 CST 2010]Planning Outline data store load process finished with exceptions: not all input records were read due to errors (or an empty input file). 0 data records were read, 0 data records were processed, 0 were successfully loaded, 0 were rejected.
    This is version 11.1.2. What am I doing wrong here? I also find it interesting that, per the Oracle docs, the command for a data load and a metadata load is the same. I guess Planning knows whether we're trying to load data or metadata based on the CSV file header?

    I don't usually bother with loading data using the Outline Load utility, but as a test on 11.1.2, using the Planning sample application, I gave it a quick go.
    In Planning, I went to Administration > Data Load Settings, picked Account as the data load dimension and Period as the driver dimension, and selected Jan as a member.
    I created a file :-
    Account,Jan,Point-of-View,Data Load Cube Name
    TestMember,100,"Local,E05,Actual,NoSegment,Working,FY11",Consol
    And used the following command line from the directory with the planning utilities
    OutlineLoad /A:plansamp /U:admin /M /N /I:F:/temp/dload.csv /D:Account /L:F:/temp/outlineLoad.log /X:F:/temp/outlineLoad.exc
    The log produced :-
    [Tue Nov 23 10:02:01 GMT 2010]Successfully located and opened input file "F:\temp\dload.csv".
    [Tue Nov 23 10:02:01 GMT 2010]Header record fields: Account, Jan, Point-of-View, Data Load Cube Name
    [Tue Nov 23 10:02:01 GMT 2010]Located and using "Account" dimension for loading data in "plansamp" application.
    [Tue Nov 23 10:02:01 GMT 2010]Load dimension "Account" has been unlocked successfully.
    [Tue Nov 23 10:02:01 GMT 2010]A cube refresh operation will not be performed.
    [Tue Nov 23 10:02:01 GMT 2010]Create security filters operation will not be performed.
    [Tue Nov 23 10:02:01 GMT 2010]Examine the Essbase log files for status if Essbase data was loaded.
    [Tue Nov 23 10:02:01 GMT 2010]Planning Outline load process finished (with no data load as specified (/N)). 1 data record was read, 1 data record was processed, 1 was accepted, 0 were rejected.
    There you go, no problems. Note that the column header values are case sensitive and must match the application's dimension names exactly, which is the usual cause of the "Unrecognized column header value" messages.
    Cheers
    John
    http://john-goodwin.blogspot.com/
