Problem with the master data restriction

Hi All,
I have the master data object 0EMPLOYEE. The requirement is to restrict future-dated records, i.e. only load records with BEGINNING DATE <= SYSTEM DATE. We implemented this at the extractor level successfully. However, even after refreshing the 0EMPLOYEE master data in BW, I still see future-dated records; maybe these are older records that were pulled earlier.
I don't want these records in the cube, so obviously I need to apply the same logic there. But where should I write this code? And which field in the cube is the BEGINNING DATE field assigned to?
Everywhere the assignment is of type "Master Data Attribute of".
Thanks,
Srinivas

Hi,
After making the change at the extractor level, you need to replicate and then activate the DataSource.
Then run the InfoPackage with a full repair request and run the delta DTP. That should fix your data issue.
For the cube logic:
You need to put the data declarations and the SELECT statement in the start routine and then write the code in the individual (field-level) routine; a rough sketch of such a start routine is shown below.
Check which field you have mapped for the master data; I think you can use calendar day for the beginning date, but I'm not sure about this.
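Just to illustrate, here is a minimal sketch of what such a start routine could look like. It assumes a BI 7.0 transformation whose source structure carries the employee begin date in a field called BEGDA; the field name is an assumption, so replace it with whatever your transformation actually receives for the beginning date. For this particular requirement a single DELETE may already be enough, without any SELECT or field-level routine:

     "Start routine body (sketch only): drop future-dated records.
     "BEGDA is an assumed field name - use your actual source field.
     DELETE SOURCE_PACKAGE WHERE begda > sy-datum.
     "In a 3.x update-rule start routine the table is DATA_PACKAGE instead:
     "  DELETE DATA_PACKAGE WHERE begda > sy-datum.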
Wait for others' suggestions.
Cheers,
shana
assign pts if it is useful

Similar Messages

  • Problem with the master data loads

    Hi Guys,
    I am trying to load 0MATERIAL_ATTR in QA, but it throws an error (Record 2, 0MATERIAL: Data record 2 ('000000000000000012 '): Version '000000000000000012 ' is not valid), and a similar error comes up when I load 0VERSION (Record 1, 0VERSION: Data record 1 ('000E '): Version '000 ' is not valid).
    For the same master data objects in development, with the same R/3 data, the loads are successful; the problem only occurs in QA.

    Hi Abdul,
    Check whether the data you are getting for these two fields is correct. It probably contains illegal characters (my guess is a trailing space). If possible, correct it in the source system itself.
    If you are doing a full load of the master data, the system will pull these values in and throw the error every time. Another way of avoiding it is to maintain routines that eliminate such characters, for example along the lines of the sketch below.
    The quick fix is to correct the values at the PSA level and load the data again.
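    For illustration only, a rough sketch of such a cleansing routine. It assumes a BI 7.0 transformation field routine, a source field SOURCE_FIELDS-matnr, and an allowed character set matching what is maintained in transaction RSKC; all of these are assumptions you will need to adapt to your own system.

         "Blank out characters BW rejects, then remove the blanks.
         DATA: lv_allowed(60) TYPE c
                 VALUE ' !"%&''()*+,-./:;<=>?_0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ',
               lv_value(60)   TYPE c,
               lv_len         TYPE i,
               lv_idx         TYPE i.
         lv_value = SOURCE_FIELDS-matnr.          "assumed source field
         lv_len = strlen( lv_value ).
         WHILE lv_idx < lv_len.
           IF lv_value+lv_idx(1) NA lv_allowed.   "character not permitted
             lv_value+lv_idx(1) = ' '.
           ENDIF.
           lv_idx = lv_idx + 1.
         ENDWHILE.
         CONDENSE lv_value NO-GAPS.               "also removes a trailing space
         RESULT = lv_value.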
    Regards.

  • How can I debug a problem with the 3g data connection on my iPhone 4s?

    This question didn't get any answers in the "Using iPhone" forum. Thought I would re-ask here. Note, I had to reset the network again today 2 days after the last reset:
    I've been having intermittent problems with the 3G data connection on my iPhone 4s on AT&T. Whenever this happens I see full bars, but I can't browse from Safari or get any data in other apps. My wife's iPhone 4 right next to me has no issues at all. Switching airplane mode on and off and turning Cellular Data off and on does not help. The only things that help are resetting the network settings or hard-restarting the phone.
    I figured my problems were hardware related, so I took my phone in for a replacement the other day. Things started out fine, but after I upgraded it to 5.01 and restored my backup, I am back to having the same issue today. I assume the issue is something coming from my backup/restore, but is there any way for me to pinpoint what could be causing it? I have a lot of apps, and I'd rather not have to reinstall and set them all back up manually, but I would if I knew that would fix the problem... for all I know one of the apps could be causing it.
    Anyone have any logs I can check or testing I can do on my end?
    Thanks!

    Unfortunately, Apple does not make any logs available to the average schmoe (like us) for networking, kernel dumps, or anything else (assuming, of course, it is not jailbroken).
    Your best bet is to take it back into the store. Ask them to look at the notes in GCRM regarding the last time this occurred, then explain that after upgrading to 5.01 it is occurring again. Then ask them for a phone swap, since it is a recurring issue and doesn't sound like an OS/software issue. Good luck!

  • Javacomm: problem with receiving data

    Hello to all,
    I have a problem on receiving data from a serial port.
    As far as I can tell, the port is obtained by the program I post below, but I can't receive any data.
    Do you have any hints?
    Thanks in advance.
    Mandy
    import javax.comm.*;
    import java.io.*;
    import java.util.TooManyListenersException;

    public class SerialRead extends Thread implements SerialPortEventListener {

        private InputStream in;

        public SerialRead(String port) {
            try {
                CommPortIdentifier id_porta = CommPortIdentifier.getPortIdentifier(port);
                CommPort porta = id_porta.open("Read from serial porta", 200);
                in = porta.getInputStream();
                SerialPort porta_seriale = (SerialPort) porta;
                porta_seriale.setSerialPortParams(9600, SerialPort.DATABITS_8,
                        SerialPort.STOPBITS_1, SerialPort.PARITY_NONE);
                porta_seriale.addEventListener(this);
                // Without this call no DATA_AVAILABLE events are delivered,
                // which is the most likely reason no data was received.
                porta_seriale.notifyOnDataAvailable(true);
                this.start();
            } catch (NoSuchPortException ne) {
                System.out.println("The port " + port + " is not present");
            } catch (PortInUseException pe) {
                System.out.println("The port " + port + " is used by " + pe.currentOwner);
            } catch (UnsupportedCommOperationException ue) {
                System.out.println("The port does not support the requested parameters");
            } catch (TooManyListenersException tme) {
                System.out.println("There can be only one listener on the port");
            } catch (IOException ioe) {
                System.out.println("I/O error while opening the port");
            }
        }

        // Keep the program alive long enough to receive serial events.
        public void run() {
            try {
                Thread.sleep(10000);
            } catch (InterruptedException e) {
                // ignore and let the thread finish
            }
        }

        public void serialEvent(SerialPortEvent event) {
            switch (event.getEventType()) {
                case SerialPortEvent.BI:
                case SerialPortEvent.OE:
                case SerialPortEvent.FE:
                case SerialPortEvent.PE:
                case SerialPortEvent.CD:
                case SerialPortEvent.CTS:
                case SerialPortEvent.DSR:
                case SerialPortEvent.RI:
                case SerialPortEvent.OUTPUT_BUFFER_EMPTY:
                    break;
                case SerialPortEvent.DATA_AVAILABLE:
                    byte[] readBuffer = new byte[20];
                    try {
                        while (in.available() > 0) {
                            int numBytes = in.read(readBuffer);
                            // Convert only the bytes actually read, not the whole buffer.
                            System.out.println(new String(readBuffer, 0, numBytes));
                        }
                    } catch (IOException e) {
                        System.out.println("I/O error while reading");
                    }
                    break;
            }
        }

        public static void main(String[] args) {
            // Check the argument count before using args[0].
            if (args.length < 1) {
                System.out.println("Usage: java SerialRead <port name>");
            } else {
                new SerialRead(args[0]);
            }
        }
    }

  • Problem with the SQL Data Source (Essbase server in Unix Env)

    I too have the same problem.
    I have set up the data source for an Oracle DB on the Essbase server (which is on a Unix box) using the Oracle Wire Protocol. I tested the ODBC connection on the server and it connects with the credentials.
    But if I choose the same DSN in EAS and try to connect, I get the error below:
    Error: 1021001 Failed to Establish Connection With SQL Database Server.
    I also downloaded a document from the Oracle site on the SQL interface in Essbase, but I could not find the inst-sql.sh file to enable the Essbase SQL interface on the server. Is this required?
    I tried with both the User and System DSN on the server, but no luck.
    Any help on this will be highly appreciated. Thanks in advance.
    Posting in a new thread since the old one is answered and no points are available to score.

    It can depend on the version; it is different for version 11. For further information see: How to define a relational data source in (odbc.ini) V11 Essbase?
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Problems with uploading master data in BI 2004S

    Hi all,
    I want to upload master data in FI from an R/3 system to a BI system. I created its transformation, then migrated it WITH EXPORT option. After that I created the Data Transfer Process (DTP); I didn't change anything, just "Save and Activate". Then I created an InfoPackage for it, chose the Schedule tab and pressed Start, then opened the DTP, chose the Execute tab and pressed Execute.
    1. In the case of a successful process (the log shows no errors), how can we check that the data was transferred successfully?
    2. Case 2: I did all the above steps for a DataSource. But when I opened the InfoPackage, it said "This InfoPackage is already used in a Process Chain". I still continued to execute its DTP, but it failed; the log said that the data was duplicated.
    What does this mean, and what should I do in this case?
    3. Case 3: I have a DataSource, Account Type, with <0ACCT_TYPE_TEXT> (icon dialog). I migrated it but it failed. Then, after I refreshed, this DataSource moved to the group that had already been migrated (at the end of the tree), but <0ACC_TYPE_TEXT> disappeared.
    So I cannot create a DTP or InfoPackage for it to transfer data. In this case, do I have to re-create the DataSource? What should I do?
    Thanks in advance.

    Case 1:
    Check whether there are duplicate records; as it is a text DataSource, the language key may be maintained. You can check in the monitor which record is showing the error.
    Case 2:
    If you migrate the DataSource, it becomes a BI 7.0 DataSource.
    If you want to recover the 3.x DataSource, you need to use transaction RSDS.
    Then you can create the data transformation and DTP again, migrate once more,
    and then load the data.
    Normally, if you are working on BI 7.0, do all the work (transformation, DTP) with the 3.x version first and then migrate the DataSource. If you migrate first and then do the transformation, you will have to do the mapping manually and may miss things and make a lot of mistakes.
    Thanks,
    Debasish

  • HT201699 I'm using an iPhone 4 which I purchased last month; there seems to be some kind of problem with the mobile data network

    Hi,
    I bought a new iPhone 4 last month and totally fell in love with it, until there was this problem with the APN disappearing. I manually added the APN info and it worked fine for some time, but after I disabled the mobile data/3G connection and enabled it later, I always get a message saying "Could not activate mobile data network" even though it is active (the 3G symbol always pops up at the top). Though the APN is filled in with valid data, it shows me the same message over and over again. It sometimes works after rebooting, but rebooting the handset every time I want to use the connection is very annoying. Please suggest something to get rid of this problem.

    https://discussions.apple.com/message/21946322#21946322

  • Problem with Material Master Data Upload using MM01?

    Dear Abapers,
    I created a recording for uploading material master data using MM01 and selected the respective views at the time of recording. But at the time of uploading, it does not select the Quality Management view and the Accounting 1 view. I even used the 'P+' function code; it works, but still does not select those two views. If I select the two views manually, the upload is successful.
    Could you please let me clear on this issue?
    Regards,
    Sekhar

    Hi Chandrasekar,
      Scrolling in the view-selection screen won't be recorded during SHDB (recording). So you need to select Basic Data 1 first, rather than Accounting 1 or Quality Management. Once you are in the Basic Data screen, fill in the required description (if it is missing) and then choose the Accounting 1 or Quality Management tab. Do a fresh recording with this approach and your issue should be solved.
    Regards,
    Kiran
    Edited by: Kiran NN on Feb 29, 2008 11:02 AM

  • Facing problem with standard master data source 0BP_DEF_ADDRESS_ATTR

    Hi All,
    I am using BW 3.5 version
    I am loading master data from 0BP_DEF_ADDRESS_ATTR (a standard DataSource) as part of my daily activities.
    This load fails frequently, throwing the following error message:
    30 duplicate record found. 1243 recordings used in table /BI0/PBPARTNER
    ID : RSDMD
    message number 199
    Is there any way I can avoid this failure? How can I handle duplicate records?
    Please help me!!
    Warm regards
    Nanduri Aditya

    Thanks for your answers and sorry for my late reply
    Hi Srinivas,
    I am using 3.5; there is no 'handle duplicate records' option in the InfoPackage.
    Hi Francisco,
    Thanks for your answers,
    The 2 OSS notes are relevant to my problem. I am currently checking in Dev whether they will fix it or not.
    Thanks a lot, it was a very helpful answer.
    Warm regards,
    Nanduri Aditya

  • Problem with the Posting Date of Goods Receipt or Invoice Receipt for PO

    Hi All,
    I am working on 2LIS_02_SCL - Purchasing Data (Schedule Line Level).
    For our report, we need the scheduled PO quantity, actual PO quantity, scheduled date and actual date.
    We are getting all these fields from the above extractor.
    For a PO that is still open (delivery completion indicator not set), we get all fields as in ECC (ME23N).
    But for a PO that is finished (delivery completion indicator set), we get the actual GR quantity, planned quantity and planned/scheduled date at schedule line level; however, we do not get the posting date of the goods receipt or invoice receipt for the PO from the history (EKBE) for all the schedule lines.
    Example for an open PO:
    PO num      Item  Sched. line  Planned date  Actual date  Difference  Actual qty  Planned qty  Difference
    450002432   10    1            01/10/2011    02/10/2011   1 day       100         100          0
                      2            02/10/2011    04/10/2011   2 days      200         400          200
                      3            03/10/2011    07/10/2011   4 days      300         300          0
    For the same PO when it is finished or completed (delivery completion indicator set):
    PO num      Item  Sched. line  Planned date  Actual date  Difference  Actual qty  Planned qty  Difference
    450002432   10    1            01/10/2011    07/10/2011   6 days      100         100          0
                      2            02/10/2011    07/10/2011   5 days      200         400          200
                      3            03/10/2011    07/10/2011   4 days      300         300          0
    For all the schedule lines it displays the last posting date (I think it is taken from the EKKO table).
    Is there any possibility to get the posting date history?

    You can have multiple receipts against a PO schedule line, but this extractor is not meant to go to that level of granularity. Please try to use 2LIS_02_SGR instead, which provides goods receipts per PO schedule line.

  • Performance problem with the bulk data

    Hi,
    I have created the mapping to load the fact table and deployed it successfully.
    When I tried to do the initial load with 12 million records, it took around 90 hours.
    What performance measures can we take to reduce this time?
    Here is a brief description of the fact table load:
    I am doing an outer join of two source tables. While loading the fact table, I do a lookup into 4 dimension tables. For two of the dimension tables I am passing 17 fields to get the keys. We don't have indexes on the sources.
    I would be thankful if you can help me out.
    Thanks
    Vinay

    Hi Jean,
    One point I did not state correctly, so I am giving the answers to your questions again below.
    1. I am using set-based mode with failover to row-based.
    2. Yes, I would expect decent load times on Unix.
    3. I am not loading with parallel inserts.
    4. Yes, I am disabling the constraints/indexes on the fact table when loading.
    5. In the execution plan of the query, the joins are done using NESTED LOOPS and HASH JOINS.
    6. There is only one non-null key in the two dimension tables. To get this key, I am passing 17 fields from the source tables. I don't have primary key/foreign key references between the fact table and the dimension tables.
    I hope I have given you the complete information.
    Thanks
    Vinay

  • Problem with the Catalog Data

    Hi All,
    We have observed that the Catalog Data object sometimes stops working and doesn't fetch any data anymore. In that case, we used to simply log out and log in again. But now we have moved to IConnector Services and can't log out and log in again. As the same Catalog Data object is used by another application, that application fails too. I was just wondering whether it is possible to reset or refresh the Catalog Data object so that it starts functioning normally again.
    Regards,
    Tanveer.

    1. Selecting and renaming each section would take me just as long as copying/pasting each section.
    Dinesh - Please note down the "cell range" which you want to import from the Excel document. In the Excel import dialog, specify this cell range and ID will import only the specified cell range.
    2. When selecting File > Place, I can't have the Excel workbook open, which I still need in order to reference information.
    Dinesh - While placing, you won't be able to have the Excel file open at the same time, because ID is reading information from that file. But once you have placed the file in InDesign, you should be able to open it.

  • Problem with the ECC6 Data sources after migration

    Dear All.
    I transferred all the DataSources required for the sales overview cube 0SD_C03.
    Let's take two scenarios.
    For example, I took 2LIS_11_VAHDR to BI 7, migrated this DataSource with the Export option, activated it, and then initialized the delta request with data.
    The request was generated successfully from BI to R/3, but in the monitor it stays yellow for a long period of time and no data arrives; I waited for 30 minutes but nothing happened.
    In the second scenario I initialized the delta process for DataSource 2LIS_12_VCHDR; this DataSource I did not migrate, but I face the same situation.
    Can you please explain what could cause this and how I can fetch the data from the ECC 6 client through the Business Content DataSources?
    I did not migrate the update rules into transformations.
    any help will be highly appreciated.
    kind regards,
    Zeeshan

    Yes, that is exactly what happened; I did not do anything, since complex ABAP routines are written in the previous update rules.
    Please let me know how to generate the update rules as they were before, or, if I have to revert the DataSource back to its previous state, what I have to do.
    I have figured out why the data is not coming from SAP; I just need the steps to revert the DataSource or to create the new transformations generated from the old update rules.
    any help will be highly appreciated.
    Kind Regards,
    Zeeshan
    Edited by: Zeeshan haider on Jun 2, 2009 6:02 PM

  • Problems with maintaining master data for infoobject 0SOLD_TO

    Hello experts,
    I'm not getting the 'Maintain master data' option when I right-click on the InfoObject 0SOLD_TO. The data is already there from previous data loads (from R/3 systems) and is visible in the reports. I'm using this InfoObject in cube 0SD_C05 to report quotation/order details. Am I missing something here? Please guide me.
    Thanks
    Sumit

    Hi Sumit,
    It looks like you have asked your question in the wrong forum.
    Please use the BI forum to post your question.
    Regards,
    Pooja

  • Problem with the field length restrictions in the WSDL file

    Hi all,
    We have created an XSD file where we have defined fields and given some restrictions (like minLength and maxLength) for each field. Below is an example for one element, "Id":
    {code}
         <xs:simpleType name="Id">
              <xs:restriction base="xs:string">
                   <xs:maxLength value="40"/>
              </xs:restriction>
         </xs:simpleType>
    {code}
    Here we have defined the maxLength of this field as 40 characters. Our WSDL uses (refers to/imports) this XSD file, and we generate the Java skeleton using RAD. But at runtime, if we set more than 40 characters, the value is still accepted; no exception is thrown. (In the generated Java skeleton these restrictions are not reflected anywhere.)
    My question is: do such restrictions defined in the XSD file work or not, and is it an industry standard to define restrictions in the XSD file?
    If yes, what more do I need to do to make them work?
    If not, is there any way to validate the fields that are input to the web service? Or do I have to write my own Java class to validate each field?
    Regards,
    Ravi

    Or is it possible that, when we specify length restrictions in the XSD (and import this XSD in the WSDL) and generate the Java skeleton from the WSDL, the restrictions defined in the XSD are mapped into the Java classes?
    For ex:
    <xs:simpleType name="Id">
        <xs:restriction base="xs:string">
            <xs:maxLength value="40"/>
        </xs:restriction>
    </xs:simpleType>
    So when, in the generated Java skeleton, we set a value for the "Id" element that is longer than 40 characters, should it throw an exception?
    Is this possible by default, or do we need to write custom validation classes to validate such fields?
    Has anybody worked in such scenarios?
    Or, simply: how do we do field validation in a web service?
    Thanks in advance.
