ODS Issue

Hi,
I have a load issue today. While activating the ODS object, I am getting an error like:
Characteristic values are not allowed if they only consist of the
character "#" or begin with "!". If the characteristic is compounded,
this also applies to each partial key.
You are trying to load the invalid characteristic value
TCode-I1//Rec.Typ-CO & (hexadecimal representation
54436F64652D49312F2F5265632E5479702D434F).
Please help me solve this issue.
Thanks in advance.

Hi
Go to transaction RSKC, enter ALL_CAPITAL, and press F8 (Execute).
OR
Go to SE38 and execute the program RSKC_ALLOWED_CHAR_MAINTAIN, giving ALL_CAPITAL or the characters you want to add.
Check the table RSALLOWEDCHAR. It should contain ALL_CAPITAL or the characters you have entered.
Refer
/people/sap.user72/blog/2006/07/23/invalid-characters-in-sap-bw-3x-myths-and-reality-part-2
/people/sap.user72/blog/2006/07/08/invalid-characters-in-sap-bw-3x-myths-and-reality-part-1
/people/aaron.wang3/blog/2007/09/03/steps-of-including-one-special-characters-into-permitted-ones-in-bi
http://help.sap.com/saphelp_nw04/helpdata/en/64/e90da7a60f11d2a97100a0c9449261/frameset.htm
For adding other characters:
OSS Note 173241 - "Allowed characters in the BW System"
Sample cleansing routine (#)
Help loading char EQUIP#1111#TAG#3311 SN#A01040 * into Cube
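A cleansing routine of that kind basically replaces every character that is not in the permitted set before the value reaches the characteristic (and converts to upper case so that ALL_CAPITAL covers it). A minimal sketch of such a 3.x transfer rule routine body follows; the source field name /BIC/ZSRCFLD and the allowed character list are placeholders for illustration, not taken from the original load:

* Hedged sketch of a 3.x transfer rule cleansing routine body.
* The source field name /BIC/ZSRCFLD and the allowed character set
* are placeholders - align them with your transfer structure and
* with the characters maintained in RSKC.
  CONSTANTS c_allowed(60) TYPE c
    VALUE ' !"%&''()*+,-./:;<=>?_0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ'.
  DATA: l_value(60) TYPE c,
        l_pos       TYPE i.

  l_value = TRAN_STRUCTURE-/bic/zsrcfld.   "placeholder source field
  TRANSLATE l_value TO UPPER CASE.         "so ALL_CAPITAL covers it

* Blank out every character that is not in the permitted set
  DO 60 TIMES.
    l_pos = sy-index - 1.
    IF NOT l_value+l_pos(1) CO c_allowed.
      l_value+l_pos(1) = ' '.
    ENDIF.
  ENDDO.

  RESULT     = l_value.
  RETURNCODE = 0.                          "0 = keep the record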
Hope it helps

Similar Messages

  • ODS Issues

    Hello Gurus,
    This is an urgent request. I would like to connect another ODS to an existing ODS as a data target. I intend to have a time reference key on the second ODS, but would like the second ODS to be populated only by data that has changed in the ODS acting as the DataSource. In other words, I would like only records with 0RECORDMODE = 'X' and ' ' to be transferred, and not 'N' (see the start routine sketch after the reply below). Can someone please tell me what I need to do, and also how I can have a timestamp as a key in my second ODS.
    This will be rewarded with top marks.

    Hello Rue,
    In the key section of the target ODS, add fields for date and time (you can use 0DATE and 0TIME). In the update rules, create a formula for these fields: map 0DATE to syst-datum and 0TIME to syst-timlo.
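    If you prefer a routine over a formula, the body is a one-line assignment per target field. A minimal hedged sketch, assuming the standard 3.x update rule routine interface (RESULT, RETURNCODE, ABORT):
    * Hedged sketch: update routine body for the 0DATE key field.
    * The routine for 0TIME is identical except for RESULT = sy-timlo.
      RESULT     = sy-datum.   "system date at load time
      RETURNCODE = 0.          "0 = keep the record
      ABORT      = 0.          "0 = do not abort the data package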
    When you generate the export DataSource, InfoPackages for init and delta are created automatically by the system. You can use those packages to initialize and then load subsequent deltas.
    Hope it helps.
    Regards,
    Praveen
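    For the other part of the question (transferring only records with 0RECORDMODE = 'X' or ' ' and skipping 'N'), one common option is to filter in the start routine of the update rules. A hedged sketch, assuming the standard 3.x start routine interface; the field name RECORDMODE is an assumption, so check the communication structure of the export DataSource:
    * Hedged sketch of a 3.x update rule start routine: keep
    * before-images ('X') and after-images (' '), drop new images ('N').
      DELETE DATA_PACKAGE WHERE recordmode <> 'X'
                            AND recordmode <> space.
      ABORT = 0.               "0 = continue processing the package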

  • Sales Header and Item integration issues.

    Experts,
    If I integrate 2LIS_11_VAHDR and 2LIS_11_VAITM into an ODS:
    Issues:
    1) If I make SO # the only key field, then I cannot capture all my item-level details, since they will be overwritten when the key is just SO #.
    2) If I make both SO # and SO Item # key fields, I can capture all my item details without any problem, since VAITM has both SO # and SO Item #. But how do I capture the header details specifically, as the header DataSource does not have SO Item #?
    Please help me out
    thanks

    Hi Hari,
    Can you be more specific about your requirement so that we can help you?
    Mahendra

  • Performance issues with query input variable selection in ODS

    Hi everyone
    We've upgraded from BW 3.0B to NW04s BI using SP12.
    There is a problem encountered with input variable selection. This happens regardless of whether we use BEx (new or old 3.x) or RSRT. When using the F4 search help (or "Select from list" in the BEx context) to list possible values, this takes forever for a large ODS (containing millions of records).
    Using ST01 and SM50 to trace the code in the same query, we see a difference here:
    NW04s BI SQL command:
    SELECT
      "P0000"."COMP_CODE" AS "0000000032", "T0000"."TXTMD" AS "0000000032_TXTMD"
    FROM
      ( "/BI0/PCOMP_CODE" "P0000" ) LEFT OUTER JOIN "/BI0/TCOMP_CODE" "T0000"
      ON "P0000"."COMP_CODE" = "T0000"."COMP_CODE"
    WHERE
      "P0000"."OBJVERS" = 'A' AND "P0000"."COMP_CODE" IN
      ( SELECT "O"."COMP_CODE" AS "KEY" FROM "/BI0/APY_PP_C100" "O" )
    ORDER BY
      "P0000"."COMP_CODE" ASC
    BW 3.0B SQL command:
    SELECT ROWNUM < 500 ....
    In 3.0B, ROWNUM is limited to 500, which results in a speedy, though limited, query. In the new NW04s BI, this renders the selection screen unusable, as ABAP dumps from timeouts occur first due to the large data volume being searched with a sequential read.
    It is not feasible to create indexes for every single query selection parameter (performance issues when loading, space required, etc.). Is there a reason why SAP seems to have fallen back on less effective code for this?
    I have tried to change the number of selected rows to <500 in the BEx settings, but one must first reach a responsive screen to get to that setting, and that is not always possible, nor is the setting saved for the next run.
    Does anyone have similar experience, or can anyone provide help on this?

    There is a reason why the F4 help on the ODS was faster in BW 3.x: in BW 3.x the ODS did not support the read mode "Only values in InfoProvider". Comparing the different SQL statements, I propose changing the F4 mode in the InfoProvider-specific properties to "About master data". This is the fastest F4 mode.
    As an alternative, you can define indexes on your ODS to speed up F4. You would need a non-unique index on InfoObject 0COMP_CODE in your ODS.
    Check below for insights
    https://forums.sdn.sap.com/click.jspa?searchID=6224682&messageID=2841493
    Hope it Helps
    Chetan
    @CP..

  • Authorization Issue with ODS

    Dear all,
    I have an authorization issue with two ODS objects.
    One I activated for BEx reporting: it is working fine in Dev, but I get an error about
    missing authorization in QUA, although some authorizations are assigned.
    The same issue occurs with a newly created ODS, which works in Dev but gives a
    missing-authorization error in QUA.
    What can be the reason for this? Any input is highly appreciated!
    Cheers,
    Claudia

    Hi,
    Check that the role(s) have been transported from your DEV to your QA, and that the user has the correct role(s).
    Also check transaction RSSM in your QA for your ODS objects; it might be that by transporting the ODS, some authorizations have been applied by default.
    hope this helps...
    Olivier.

  • Error while activating the ODS (prod. issue)

    Hi BW gurus,
    I am facing a major issue while activating the ODS; I am getting the error below.
    Request REQU....., data package 000002 incorrect with status 5
    Request REQU....., data package 000001 incorrect with status 5.
    If anyone has come across this issue, please help me out.
    I will reward the points surely.
    Cheers,
    Rakesh

    Hi Rakesh
    The error message
    "Key value exists in duplicate (not allowed by the ODS object type)" in the screenshot you sent means you have set the ODS for unique records. That is why it is not allowing a second record with a key that is already in the ODS.
    Please check in your ODS settings whether 'Unique Data Records' is checked.
    Sorry for my late response!
    Regards
    GSK

  • ODS Activation Issue in Prod - Critical

    We are on BI 7. We have a huge ODS, ZXYZ, in Prod. It has more than 400 million records. BEx reporting is off.
    Over the last weekend some data was added, the requests were being activated, and there was master data activity in parallel. The ODS activation failed with SYSTEM_EXCEPTION. Unfortunately, without investigating the root cause, the ODS activation was resumed, and it returned SYSTEM_CANCELED again as the master data activity was still going on.
    The problem is that the request is now only partially activated and is in a faulty status. When trying to activate, we get the following error message.
    Activation of M records from DataStore object ZXYZ terminated
    'A' and 'U' entries exist in RSODSACTREQ for request REQU_CTAKWUIM6ZAWBQC1K6Q5TVL8M - termination
    When trying to delete the request that was partially activated, it fails, complaining about the faulty status of the ODS.
    The problem is that we cannot move forward, as the request can neither be deleted nor activated because of this.
    The RSODSACTREQ table shows activation status 2 (activation failed) for this request.
    Can you experts please advise?

    Hi,
    I see that your message was not answered and you may not have this issue any more. But for future reference you could refer to note 947481. You might be able to use step 1 from this note to clean the failed request and reload your data.
    Hope it will help some day. It did work for us.
    Sameer Gandhi

  • ODS activation (urgent production issue)

    Hello Gurus,
    I had loaded a full upload from R/3 to ODS (ODS1),
    then in the BW system from ODS1 to ODS (ODSM),
    where this is the process:
    R/3 -> BW ODS1 (full load) -> BW ODSM (init load)
    My Issue:
    While I try to load from ODS1 to ODSM in BW, I am getting an error as below.
    Error Message
    > Errors in source system
    > Error 8 when starting the extraction program
    Details Tab
    > Extraction (messages): Errors occurred
    > Error occurred in the data selection
    I deleted the error load in ODSM and tried to schedule again, at which point it gave the message:
    Delete init. request REQU_AIS45B2I7N6FU2G3WAT60NPEU before running init. again with the same selection
    Thanks will be given through points
    Regards
    Sandy

    Dear Sandy,
    You are getting an error while loading from one ODS to another ODS, right?
    After loading to the first ODS, right-click it and choose Generate Export DataSource; the DataSource for the target ODS will then be created. Make sure that the target ODS update rule is active, and replicate the local source system connection in RSA1 -> Source Systems.
    Then go to SE38 and execute the program RS_TRANSTRU_ACTIVATE_ALL. It will ask for an InfoSource and a source system: give 8ODS1 ('8' followed by your first ODS name) as the InfoSource and the technical name of the local source system (you can see it in RSA1 -> Source Systems), and execute the program. Then delete the erroneous init request and load the data.
    Make sure that the target ODS update rule is active.
    Assign points if it solves your problem.
    Best Regards,
    SG

  • Load issue in ODS

    Dear all,
    We are facing an issue loading data into an ODS. Background time is too high and response time is very slow.
    The process chains are not starting automatically, and manual loads also get stuck in between.
    Please suggest some possible solution.
    Regards,
    Mohankumar.G

    Hi,
    If you are working with BI 7, go to the DTP that you are using to load your DSO.
    In the menu bar, open the 'Goto' menu and select 'Settings for Batch Monitor'.
    In that screen, change the number of processes to '9'
    and the job class to 'A'.
    Narendra Reddy

  • ODS Activation Issue

    Hi All,
    I am getting the error below while activating the ODS. I have searched SDN but was not able to find a solution for this.
    Error when inserting the data record for data package 8
    Message no. RSODSO_PROCESSING002
    Diagnosis
    Failed to store data records persistently. Possible causes are:
    - You tried to save a record that is already stored in the database.
      This error only occurs with DataStore objects with the option "Unique Data Records".
    - A unique index was created for the table of active data.
      You can check this in the Data Warehousing Workbench by choosing "Display" for the DataStore object and opening the "Indexes" folder.
    - There is a technical problem on the database.
    But in the DSO, "Unique Data Records" is unchecked, and there is nothing in the Indexes folder to delete.
    This DSO gets data from two DataSources, which are flat files. I am able to load and activate data from the second DataSource, but when loading and activating data from the other one, I get the above error.
    So can anyone let me know the resolution for this issue?
    Regards
    Sankar

    Hello Punitha,
    Thanks for your reply. But I am not able to see any characteristic name in the log.
    Please find the below log.
    Error when inserting the data record for data package 9
    Process 000009 returned with errors
    Error when inserting the data record for data package 10
    Process 000010 returned with errors
    Error when inserting the data record for data package 11
    Process 000011 returned with errors
    Data pkgs 000001; Added records 1-; Changed records 0; Deleted records 0
    Data pkgs 000002; Added records 1-; Changed records 0; Deleted records 0
    Data pkgs 000003; Added records 1-; Changed records 0; Deleted records 0
    Data pkgs 000004; Added records 1-; Changed records 0; Deleted records 0
    Data pkgs 000005; Added records 1-; Changed records 0; Deleted records 0
    Data pkgs 000006; Added records 1-; Changed records 0; Deleted records 0
    Data pkgs 000007; Added records 1-; Changed records 0; Deleted records 0
    Data pkgs 000008; Added records 1-; Changed records 0; Deleted records 0
    Log for activation request ODSR_4H47N3DWSG0OESY8S7V9DDPA4 data package 000001...000008
    RSS2_DTP_RNR_SUBSEQ_PROC_SET GET_INSTANCE_FOR_RNR       498 LINE 43
    RSS2_DTP_RNR_SUBSEQ_PROC_SET GET_TSTATE_FOR_RNR 7 LINE 330
    Status transition 7 / 7 to A / A completed successfully
    RSS2_DTP_RNR_SUBSEQ_PROC_SET SET_TSTATE_FURTHER_ERROR_OK LINE 367
    Errors occurred when carrying out activation
    Please take a look and suggest a better way to resolve this.
    Regards
    Sankar

  • ODS performance issue.

    We have an ODS which has been facing severe performance issues. The reason is mostly the high number of writes during data capture from the mainframe, the volume of data, and the huge increase in the number of people/applications querying the database for operational planning, statistical analysis, etc. One thing I noticed about this database is that there is a huge number of unused fields in each table. Some of the fields are not used at all; sometimes the data capture operation writes 30 or 40 whitespace characters to a field (no idea why). My question is: will dropping these unnecessary fields improve performance?

    "If we tell them to drop those 15 unused fields will the database performance improve significantly?"
    Why guess? How can we know?
    Further root cause analysis is required.
    If you have system-wide performance problems, use a system-wide report like AWR or Statspack to investigate what the biggest issues are.
    See http://jonathanlewis.wordpress.com/statspack-examples/ for advice on interpreting these reports.
    "For example if I have a single table with 20 fields and only 5 of them hold any meaningful data and the remaining 15 are just garbage. Let's say 10 hold null values and remaining contain 5 whitespace of differing length and there are about 70 write transactions per second. If we tell them to drop those 15 unused fields will the database performance improve significantly?"
    Who knows?
    Will it have some effect? Probably.
    Is it relevant to your issues? Who knows.
    Are you even likely to notice the effect? I would GUESS not.
    "The reason is mostly due to high number of writes during data capture from the mainframe, volume of data and the huge increase in the number of people/applications querying the database for operational planning, statistical analysis etc"
    Zoom in on these areas individually.
    Trace these parts of the application.
    What are these areas of the application waiting on?
    Are there common themes to the issues in these specific areas?
    Is it inadequate resources? Increasing hardware may not be relevant to your issue.
    In fact, just upgrading hardware can make things worse.
    Is it specific contention?
    Is it bad design leading to inefficient ways to access the required data?
    Do you use partitioning?
    Have you read "Scaling to Infinity" by Tim Gorman? http://www.evdbt.com/papers.htm
    Is partitioning even relevant to your issues?
    Who knows? Not us. Yet.
    Oracle is very well instrumented.
    There should be no need to guess and throw money at hardware before you know what the problem is.

  • Authorization issue when I display data from ODS, InfoCube, MultiProvider

    Hi Experts,
    When I try to display data from an ODS, InfoCube, MultiProvider, or InfoSet in the production system, I face an authorization issue.
    Does anybody have an idea which authorization objects are needed to display data from InfoProviders?
    SIRI

    Check for the below authorizations in your role:
    S_RS_ICUBE
    ACTVT          03
    RSICUBEOBJ     AGGREGATE, CHAVLREL, DATA, DATASLICE,   DEFINITION, EXPORTISRC, UPDATERULE
    RSINFOAREA     *
    RSINFOCUBE     <your cubes>
    S_RS_ODSO
    ACTVT          03
    RSINFOAREA     *
    RSODSOBJ     <your DSOs>
    RSODSPART     DATA, DEFINITION

  • Loading Issue in ODS

    Hi,
    I have a lot of data getting loaded into an ODS and the load gets stuck on material; I mean, it takes a long time to load.
    It looks like a performance issue in loading. Please let me know how I should proceed.
    Sandy

    Hi
    I think you need to create a secondary index on your ODS, especially when the data volume is high (for example on document number there will be many records); that is likely why material is loading slowly.
    Try to create an index on the ODS.
    Under Indexes, you can create secondary indexes by using the context menu in order to improve the load and query performance of the ODS object. Primary indexes are created automatically by the system.
    If the values in the index fields uniquely identify each record in the table, select Unique Index from the dialog box.  The description of the indexes is specified by the system. To create a folder for the indexes, choose Continue from the dialog box. You can transfer the required key fields into the index folder using Drag&Drop.
    You can create a maximum of 16 secondary indexes. These are also transported automatically.
    hope this helps
    Santosh

  • Data Issues in ODS

    Hi Gurus,
    I want your ideas to solve an issue that has come to me.
    I am loading flat file data to an ODS (3.5).
    For example:
    Key fields: Material, Customer, Sourcesystem.
    Data fields: Product, Price, Productcomment, Pricecomment.
    In file 1:
    Material, Customer, Sourcesystem, Product, Productcomment
    1001, xyz, pp, Book, good
    In file 2:
    Material, Customer, Sourcesystem, Price, Pricecomment
    1001, xyz, pp, 12.00, More
    When I loaded file 1 to the ODS I can see the data as
    1001, xyz, pp, Book, Good
    When I loaded file 2 to the ODS I am seeing
    1001, xyz, pp, Book, 12.00, -, More
    My issue is that I want to see
    1001, xyz, pp, Book, Good, 12.00, More
    In the ODS I changed the key figure update to Addition, but Productcomment and Pricecomment are characteristics.
    Can anyone give me an idea how to do this?
    Regards
    Rao VS

    hi,
    Make the data fields overwrite and load.
    If you need the key figure data as addition, use a cube that gets its data from this ODS and load delta data to the cube from the ODS.
    So load file 1 and file 2 to the ODS, then do a delta load to the cube (do the delta after the init load).
    Ramesh

  • Issue in loading ODS to Cube (urgent)

    Hi All,
    We are facing an issue in loading the AR, AP and G/L ODS to the cube.
    The steps we have done are:
    1) A repair full request was done in the ODS; we got around 35 lakh (3.5 million) records.
    2) Did a full load to the cube and got 35 lakh records.
    3) Did a delta load to the ODS and got 189 records.
    4) Did a delta load to the cube and got 1 lakh (100,000) records.
    This is where we could not figure out what went wrong.
    1 lakh records to the cube is not possible, is it?
    Did we go wrong anywhere?
    Could anyone guide us on what has to be done?
    Kind Regards,
    Shanbagavalli.S

    Hi,
    Thanks for the response.
    Delta is already done in the cube and ODS.
    No, we didn't do an initialization. We did a repair full in the ODS and a full update to the cube.
    What we have done is:
    We planned to do a selective deletion from the G/L cube for Q2 2007.
    Did a repair full request to the ODS.
    From the ODS to the cube, a full load (as we had deleted the data from the cube).
    The problem is that after doing this it has pulled data for Saturday, Sunday, Monday and Tuesday (the past five days).
    But the problem is with the delta: it is still pulling only the same 189 records that were pulled in yesterday evening's data load.
    Can you tell me what the methods are to load those days' records, as well as the delta records, to the cube?
    Thanks in advance.
    shanba
