Data loading.... reporting requirement

Hi Experts, we have a critical requirement here.
We have two data sets: Actual data and SP (Spring Plan sales forecast) data.
The requirement for this particular report looks like this:
1) For Actuals, May '09 - Aug '09 data is to be loaded
2) For the previous version, Sep '09 sales forecast data is to be loaded
3) From Oct '09 to Apr '10, the RSF data is to be loaded
I need an approach for how to proceed (will we have to go for 2 cubes?).
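To make the time slicing concrete before the replies: the report is one continuous time line stitched from three sources, and whether that lands in one cube with a version/source characteristic or in two cubes is exactly the design question. A minimal Python sketch of the month-to-source mapping (my own illustration only; the source names are made up and nothing in the thread settles the cube design):

    # Sketch only: month -> data set feeding the report for that month.
    # Source names (ACTUAL, SP_FORECAST, RSF) are illustrative labels.
    SOURCE_BY_MONTH = {
        "2009-05": "ACTUAL", "2009-06": "ACTUAL",
        "2009-07": "ACTUAL", "2009-08": "ACTUAL",
        "2009-09": "SP_FORECAST",   # previous-version Sep '09 forecast
    }
    for y, months in ((2009, (10, 11, 12)), (2010, (1, 2, 3, 4))):
        for m in months:                      # Oct '09 .. Apr '10
            SOURCE_BY_MONTH[f"{y}-{m:02d}"] = "RSF"

    def pick_source(month: str) -> str:
        """Which data set feeds the report for a 'YYYY-MM' month."""
        return SOURCE_BY_MONTH[month]

    print(pick_source("2009-09"))   # SP_FORECAST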

I have to tell you, as a consultant I have two thoughts on posts like this:
1) The angel on my right shoulder agrees with Glenn -- "Someone should guide them. They know not what they do. Competent consultants seem expensive but are cheap in the long run. Help them by working for them for a mere day. Or even help them on OTN with silly stories." This path leads to abasement, penury, and a clean conscience.
2) The devil on my left says -- "Mwuhahahah! Let them crash and burn. The more the merrier. They bite the dangerous fruit of Essbase and taste the bitter ashes of poorly designed databases in their mouth. Then *you* swoop in and look like a genius -- I have taught you well in the Black Arts of Essbase, oh acolyte. But *they* have not sacrificed to the mighty Arbor Tree. Mwuhahahah!" This path leads to ego, great wealth (relatively speaking), and the Portrait of Dorian Gray.
Okay, I exaggerate. And why do mythological figures always speak in outdated English? Yes, they're old, but can't they pick up new speech patterns? But I digress from my digression.
Glenn is right -- a little money up front will be worth untold amounts in the end. Once you've committed to a bad design, you are really stuck. The boards and blogs can help, but there's no substitute for someone who's done it before, made the mistakes, and is ready and willing to help you avoid them and create an awesome system.
Just my $0.02.
Regards,
Cameron Lackpour

Similar Messages

  • Data Load Reporting Tool

    We check our daily loads every day and manually update an Excel sheet with information about successful and failed loads.
    The company would like an automatic report with this information. Is there anything already built for this? I did not find anything in the Business Content.
    I was thinking of creating an InfoCube with information from tables like RSMONICDP, where this information may already be stored in the system.
    Thanks in advance,
    Pablo

    A big weak spot of BW is the lack of monitoring tools for loads, and for a whole bunch of other operational areas as well. To its credit, SAP generally has all the pieces captured, but it needs to pull them together; what is really needed is a BW Admin cockpit to improve monitoring. Many people want what you are asking for. You can build a lot of it yourself, but much more needs to be available out of the box.
    I suspect the lack (or weakness) of these kinds of admin tools has to do with the developers not having to "eat their own cooking". Not an unusual thing in the IT world. Admin tools generally don't sell a product, so they don't percolate to the top of the priority list very often. At companies where the product is also run as a service offering and the developers sit in the cubicle next to the Admin/Operations folks, it is usually better.
    I wonder how many production BWs the development staff actually support on a daily basis.
    It would be nice to see SAP survey the client base asking about Admin/Operational support improvements.
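    For what it's worth, here is a rough sketch of Pablo's idea of reading the monitor tables directly, assuming the open-source pyrfc connector and the generic RFC_READ_TABLE function module. RSMONICDP is named in the post, but its field layout and all connection details below are assumptions, not a tested recipe:

        # Hedged sketch: pull raw rows from a BW monitor table via RFC.
        # Assumes the pyrfc library; host/user values are placeholders.
        from pyrfc import Connection

        conn = Connection(ashost="bw-host", sysnr="00", client="100",
                          user="RFC_USER", passwd="secret")

        result = conn.call("RFC_READ_TABLE",
                           QUERY_TABLE="RSMONICDP",   # table named in the post
                           DELIMITER="|",
                           ROWCOUNT=200)

        # Each DATA entry is one raw row; the delimiter splits it into the
        # columns listed under FIELDS, whatever the table's layout is.
        columns = [f["FIELDNAME"] for f in result["FIELDS"]]
        for row in result["DATA"]:
            record = dict(zip(columns, row["WA"].split("|")))
            print(record)   # feed this into the report instead of typing it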

  • How to create a report in BEx based on the last data loaded in a cube?

    I have to create a query with a predefined filter based on the "latest SAP date", i.e. the user only wants to see the very latest situation from the last load. The report should show only the latest inventory stock situation from the last load. As I'm new to BEx, I am not able to find a way to achieve this. Is there a time characteristic which holds the last update date of a cube? Please help and suggest how to achieve this.
    Thanks in advance.

    Hi Rajesh.
    Thanks for your suggestion.
    My requirement is a little different. I build the query on a MultiProvider, and I want to see the latest record in the report based only on the latest date (not the system date) on which data was last loaded to the cube. This date (when the cube was last loaded with data) is not populated from any DataSource. I guess I have to add the "0TCT_VC11" cube to my MultiProvider to fetch the date when my cube was last loaded with data. Please correct me if I'm wrong.
    Thanks in advance.
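    For illustration only: the "latest load only" logic being described boils down to a filter on the maximum load date, whatever characteristic ends up holding it. A toy Python version with made-up data:

        # Toy sketch: keep only the records from the most recent load date.
        rows = [
            {"material": "M1", "stock": 10, "load_date": "2011-04-01"},
            {"material": "M2", "stock": 7,  "load_date": "2011-04-01"},
            {"material": "M1", "stock": 12, "load_date": "2011-04-02"},
        ]
        latest = max(r["load_date"] for r in rows)
        print([r for r in rows if r["load_date"] == latest])
        # [{'material': 'M1', 'stock': 12, 'load_date': '2011-04-02'}]

    In BEx terms that filter would be a variable on the load-date characteristic, which is why a source for that date (such as the 0TCT_VC11 idea above) is needed in the MultiProvider.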

  • Report Preview Tab and data loading

    I have 2 questions:
    When I first started developing a report, opening the file gave me both the Design and Preview tabs. It seems that after I downloaded and viewed one of the sample reports, the Preview tab disappeared (though it's not obvious the two are related). I checked the box under Options/Layout/Display Preview Panel, but it did not seem to help. Is there something else I missed?
    The other question, which may have a bearing on my first, is that when I open my report file, the database records are not automatically loaded (thus there is nothing to preview at that point). Is there a way to set an automatic logon to the database so that the load happens when the file opens, and not only when I actually want to preview results?
    Thanks in advance....Steph

    Don,
    Thanks for the response. I tried what you suggested and it does not seem to work all the time. There may be something else that I need to consider.
    I was wondering... when I do "Save Data with Report", does it save ONLY the data, or does it save some intelligent connection back to the database? Also, is this function the same as the one listed under Options/Reporting? I selected the one on the File menu (as you suggested), but it doesn't seem to change the selection in the Options/Reporting window.
    Apologies for my attempt at a "2-fer-1". I only did that because I believed the two might be related... maybe not. I'll repost it as a separate issue/concern.
      Cheers...Steph

  • Errors in data loading & BEx reporting

    Hi all,
    I would like to know what possible errors occur in data extraction & reporting,
    and how we would rectify them in real time.
    Can anyone give me some tips, please? All masters, please advise me on some solutions based on real-time experience.
    Regards
    swetha

    Hi Swetha,
    I guess you are new to BW. Most of these problems are best learnt on the job: the more time you spend on the system, the more you will learn. This forum has lots of info on the possible errors, and you can search for the specific error you face.
    The common errors during extraction & reporting are:
    1. Timestamp error.
    2. Invalid values
    3. Tablespace Issue
    4. Job getting cancelled due to lack of resources.
    5. Deadlocks
    6. No data available for reporting
    7. Job running for a long time
    etc..etc..etc...
    The list is too long to discuss here. You can refer to the web log by Siggi: /people/siegfried.szameitat/blog/2005/07/28/data-load-errors--basic-checks
    You can search the forum using the keywords I mentioned.
    Hope this helps.
    Bye
    Dinesh

  • Refresh WebI reports automatically after BI data loads?

    Hello,
    We are about to install BOE X.1 on top of BI 701.
    The idea is that users would do their daily WebI reporting, but obviously only after the SAP BI night batch scheduling has finished and the BI InfoProviders are filled.
    I've read that BOE offers the ability to refresh the reports up front.
    We were wondering if there is a way to easily implement this logical dependency: refresh the WebI report only after the end of the BI data loads.
    There is, of course, the possibility of using an external scheduler, but we have checked the licensing and it's quite expensive.
    Is there another way to do this?
    Many thanks for your attention.
    Best Regards.
    Raoul

    Hi Alan,
    Thank you very much for your quick answer.
    I would like to make sure that I understand you, since I'm not very familiar with BOE:
    First, we create an event in the CMC and connect it to a file location.
    Then we schedule the document and add the file event: do you mean schedule the WebI report?
    Finally, we create the file as part of the BEx query refresh process: how exactly do we do that, in the BI process chains?
    Thank you very much in advance for your help.
    Best Regards.
    Raoul
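    To make the last step concrete: a BOE file event fires when a file appears at the location the CMC event watches, so the final step of the BI process chain only has to create that file (for example via an OS-command step). A minimal sketch, where the path and the way it is invoked are assumptions:

        # Create the trigger file the CMC file event is watching.
        # Path and schedule wiring are assumptions, not from the thread.
        from pathlib import Path
        from datetime import datetime

        TRIGGER = Path(r"\\boe-server\events\bi_load_done.trg")

        # The event fires on the file's appearance, so (re)creating it
        # after the last load step releases the scheduled WebI report.
        TRIGGER.write_text(f"BI loads finished {datetime.now():%Y-%m-%d %H:%M}\n")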

  • Error while loading Reported Financial Data from Data Stream

    Hi Guys,
    I'm facing the following error while loading Reported Financial Data from Data Stream:
    Message no. UCD1003: Item "Blank" is not defined in Cons Chart of Accts 01
    The message appears in the target data. Item is not filled in almost 50% of the target data records, and so the error message appears.
    Upon deeper analysis I found that some items are defined with a Dr./Cr. sign of + and with no breakdown. When these items appear as negative (Cr.) in the source data, they are not properly loaded to the target data: Item is not filled, hence the error.
    For example: item "114190 - Prepayments" is defined with a + Debit/Credit sign. When it is posted as negative/credit in the source data, it is not properly written to the target.
    Do I need to define a breakdown category for these items? I think there's something wrong with the item definitions, or I'm missing something...
    I would highly appreciate your quick assistance in this.
    Kind regards,
    Amir

    Found the answer in OSS Note 642591.
    Thanks

  • Data loaded in the target is not reportable

    Dear all,
    In one of our cubes, data is loaded daily. A couple of days back a load failed with an ABAP dump ("RAISE_EXCEPTION"); subsequently the following days' data has loaded but is found to be not reportable.
    How do we handle this?
    Can anyone provide reasons for the error, and the steps for subsequently making the requests reportable?
    Experts' help will be appreciated.
    Note: further details of the error show "ADD_PARTITION_FAILED".
    Regards,
    M.M

    Hi,
    Can you have a look in ST22 for short dumps on the day this load failed? There you can get some more details of the error; it will help us understand it.
    Have a look at SAP Note 539757:
    Summary
    Symptom
    The process of loading transaction data fails because a new partition cannot be added to the F fact table. The loading process terminates with a short dump.
    Other terms
    RSDU_TABLE_ADD_PARTITION_ORA , RSDU_TABLE_ADD_PARTITION_FAILED , TABART_INCONSITENCY, TSORA , TAORA , CATALOG
    Reason and Prerequisites
    The possible causes are:
    SQL errors when creating the partition.
    Inconsistencies in the Data Dictionary control tables TAORA and TSORA. There must be valid entries for the data type of the table in these tables.
    Solution
    BW 3.0A & BW 3.0B
    If there are SQL errors: analyze the SQL code in the system log or short dump and, if possible, eliminate the cause. The cause is often a disk space problem or lock situations on the database catalog or, less frequently, the partitioning option of the ORACLE database not being installed.
    In most cases there are inconsistencies in the Data Dictionary control tables TAORA and TSORA. As of Support Package 14 for BW 3.0B / Support Package 8 for BW 3.1C, the TABART_INCONSITENCY exception is issued in this case.
    1) Check whether there is an entry for the corresponding data type of the table in the TAORA table.
    The TAORA table contains the assignment of data classes to data tablespaces and their attributes, for example:
        Data class   Tablespace
        DDIM         PSAPDIMD
        DFACT        PSAPFACTD
        DODS         PSAPODSD
    2) Check whether TSORA contains an entry for the tablespace from the TAORA table.
    The TSORA table contains the assignment of the data tablespace to the index tablespace, for example:
        TABSPACE     INDSPACE
        PSAPDIMD     PSAPDIMD
        PSAPFACTD    PSAPFACTD
        PSAPODSD     PSAPODSD
    For more information, see Notes 502989 and 46272.
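    The two checks reduce to a simple referential test between the two control tables. A toy version using the example entries above (an illustration only, not a supported tool):

        # Toy check: every tablespace assigned in TAORA must appear in TSORA.
        taora = {"DDIM": "PSAPDIMD", "DFACT": "PSAPFACTD", "DODS": "PSAPODSD"}
        tsora = {"PSAPDIMD": "PSAPDIMD", "PSAPFACTD": "PSAPFACTD",
                 "PSAPODSD": "PSAPODSD"}        # data -> index tablespace

        for dataclass, tablespace in taora.items():
            if tablespace not in tsora:
                print(f"Inconsistent: {dataclass} -> {tablespace} missing in TSORA")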

  • How to make data loaded into cube NOT ready for reporting

    Hi Gurus: is there a way by which data loaded into a cube can be made NOT available for reporting?
    Please suggest.
    Thanks

    See, by default a request that has been loaded to a cube is available for reporting. Now, if you have an aggregate, the system needs this new request to be rolled up into the aggregate as well before it becomes available for reporting. The reason? Queries are written against the cube, not against the aggregate, so you only know at runtime whether a query will hit a particular aggregate. That means a query should ultimately get the same data whether it reads the aggregate or the cube. If a request has been added to the cube but not to the aggregate, the two objects hold different data, so the system takes the safer route of not making the un-rolled-up data visible at all, rather than serving inconsistent data.
    Hope this helps...
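    A toy illustration of that consistency argument (numbers made up): a request loaded to the cube but not yet rolled up means the cube and the aggregate would answer the same query differently, which is exactly what BW refuses to allow.

        cube_requests      = [100, 40, 25]    # third request just loaded
        aggregate_requests = [100, 40]        # roll-up has not run yet

        print(sum(cube_requests))        # 165 <- query hitting the cube
        print(sum(aggregate_requests))   # 140 <- same query via the aggregate
        # BW hides the un-rolled-up request until both sources agree.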

  • Training on CalcScripts, Reporting Scripts, MaxL and Data Loading

    Hi All
    I am new to this forum. I am looking for someone who can train me on topics like CalcScripts, Reporting Scripts, MaxL and data loading.
    I am willing to pay for your time. Please let me know.
    Thanks

    Hi Friend,
    As you seem to be new to Essbase, you should first learn what Essbase and OLAP are and the difference between dense and sparse, and then use the Essbase Tech Ref for further reference.
    After that, go through
    https://blogs.oracle.com/HyperionPlanning/ and start exploring calc scripts, MaxL, etc.
    And all of this is free, free, free...
    Thanks,
    Avneet

  • Loading Reporting data: "time frame" "frequency" "update times"

    In a requirements package from which I am to design and build BW reporting, information has been provided on "time frame", "frequency" and "update times".
    Can you share your thoughts on these and how to consider them during design/build? Examples would be very helpful. Also, how does "time basis" fit in here?
    For example, what is the implication when, for a particular requested report, frequency = "on demand"?
    If time frame = quarterly, what effect does that have on "update time"?
    Which of these speak to when data needs to be loaded into BW? Which refer to when reports need to be generated? And is there a link between the two?
    How is the determination made of the frequency and the specific time to load data for reporting purposes?
    In general, what influences the decisions on "time frame", "frequency" and "update times" during the requirements gathering phase?

    There is not really a set answer for that; basically you have to listen to what your users need and what you can offer them based on your installed base.
    "Time basis" I'm not sure about; it could refer to the type of time field you are using: transaction date, ...
    Let me give you an example.
    E.g. users here want their reports on Wednesday morning; this is the only time they will look at them.
    Frequency: you can get away with loading once a week.
    Time frame: because all my users will be in the system as of 5 am, I need to make sure data is ready by then. But I can't load before 1 am, because that is when the FICO data is already loading...
    Update times: we could do some updates of the new plan version during the rest of the week, since they are relatively small (although I'm not really sure what they mean by "update time"; it sounds the same as loading frequency).
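    Expressed as a toy check, that example is just a load window bounded by the FICO load on one side and the users' logon time on the other (the 01:00 and 05:00 figures are from the example above; the code is only an illustration):

        from datetime import time

        FICO_DONE = time(1, 0)   # cannot start before FICO finishes
        USERS_IN  = time(5, 0)   # data must be ready when users log on

        def may_start(now: time) -> bool:
            """True if the weekly load may start inside the agreed window."""
            return FICO_DONE <= now < USERS_IN

        print(may_start(time(2, 30)))   # True: inside the window
        print(may_start(time(0, 45)))   # False: FICO still running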

  • Data loading

    The requirement: I enhanced the extractor by adding fields, mapped those fields in BW into a DSO, added them to a dimension in the cube, and also mapped them in the MultiProvider; the report is built on the MultiProvider.
    Now, for testing, the data needs to show up in the report.
    How can I achieve this? Do I need to load the data from the InfoObjects and the DSO to the cube; is that the right approach?
    Please suggest how to achieve it, starting from loading the InfoObjects, then the DSO, then the cube, and finally getting the data reflected in the report.

    Hi there,
    I guess you've said it all.
    You enhanced a DataSource with new fields, added those fields to the DSO, and mapped them from the DataSource to the DSO. You also added the fields to the cube and mapped them from the DSO to the cube. You added the fields to the MultiProvider, identified them there, and changed the query to use the new fields.
    So, for loading the data, you should load the attributes and/or texts for the new InfoObjects first.
    After that, activate the master data for those new fields.
    After that, load the data from the DataSource to the DSO and activate the DSO.
    After that, load the data from the DSO to the InfoCube; then you can check the report with the contents of the new data (the new fields). A skeleton of this ordering follows below.
    Hope this helps,
    Regards,
    Diogo.
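    The same sequence as a skeleton, with every BW step reduced to a hypothetical placeholder (none of these are real APIs; the point is only the strict ordering, since each step depends on the previous one):

        def load_master_data():     ...   # attributes and texts first
        def activate_master_data(): ...   # SIDs must exist before transaction data
        def load_dso():             ...   # datasource -> DSO
        def activate_dso():         ...   # activation makes DSO data usable
        def load_cube():            ...   # DSO -> InfoCube, then check the query

        for step in (load_master_data, activate_master_data,
                     load_dso, activate_dso, load_cube):
            step()   # each step must succeed before the next one runs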

  • Error in report generation-Load report failed

    I have installed the Crystal Reports for VS2005 runtime in my live environment. When I try to load the report file from a .NET page I get the error 'Error in Report generation - Load report failed'.
    This works fine in development and testing, but when moved to production we are having issues.
    I believe I have a workaround, i.e. opening the RPT in a VS 2005 environment and re-writing the connection string. This fix worked in development and testing when we had the same issue.
    But I am not sure whether this is a known issue with Crystal Reports and, if so, whether there are any hot fixes for it.
    Please help.

    Hi Don,
    The reason why I posted the item under 'Data Connectivity issues' is that I suspect the problem is with the connection string. Though I had provided the correct connection string, re-writing it with the same info might help fix the issue with loading the report.
    Does that mean there is a problem with the connectivity?
    I have a risk here: I can't re-write the connection string, i.e. open the RPT file in VS2005 and bind it. The production web server & database server sit in a third-party environment, not on the company network, so I can't connect to that server from any other servers on the network to re-write it.
    Is there any fix which can be applied to the server to get this issue resolved?
    Cheers
    Nithya.

  • Data load from DSO to cube fails

    Hi Gurus,
    The data loads failed last night, and when I dig inside the process chains I find that the cube which should be loaded from the DSO is not getting loaded.
    The DSO has been loaded without errors from ECC.
    The error message says "The unit/currency 'source currency 0CURRENCY' with the value 'space' is assigned to the key figure 'source key figure ZARAMT'".
    I looked in the PSA; it has about 50,000 records,
    and all data packages have a green light and all amounts have 0CURRENCY assigned.
    I went into the DTP and looked at the error stack; it had nothing in it. Then I changed the error handling option from 'no update, no reporting' to 'valid records update, no reporting (request red)' and executed; the error stack then showed 101 records.
    The ZARAMT field has 0CURRENCY blank for all these records.
    I tried to assign USD to them; the changes were saved and I executed again, but then a message says that the request ID should be repaired or deleted before the execution. I tried to repair it; it says it cannot be repaired, so I deleted it and executed. It fails, and the error stack still shows 101 records. When I look at the records, the changes I made no longer exist.
    If I delete the request ID before changing and then try to save the changes, they don't get saved.
    What should I do to resolve the issue?
    thanks
    Prasad

    Hi Prasad....
    The error stack is request-specific. Once you delete the request from the target, the data in the error stack is deleted as well.
    Actually, in this case what you are supposed to do is:
    1) Change the error handling option to 'valid records update, no reporting (request red)' (as you have already done) and execute the DTP; all the erroneous records will accumulate in the error stack.
    2) Then correct the erroneous records in the error stack.
    3) Then in the DTP, on the "Update" tab, you will find the option "Error DTP". If it is not already created, you will see "Create Error DTP"; click there and execute the error DTP. The error DTP fetches the records from the error stack and creates a new request in the target.
    4) Then manually change the status of the original request to green.
    But did you check why the value of this field is blank? If these records come again as part of a delta or full load, your load will fail again. Check in the source system and fix it there for a permanent solution.
    Regards,
    Debjani......

  • Problem with Master Data Load

    Dear Experts,
    If somebody can help me with the following case, please give me a solution. I'm working on a BI 7.0 project where we needed to delete master data for an InfoObject (material). I did this through tcode "S14". After that, I tried to load the master data again, but the process broke off and the load only got through half of the data.
    This is the error:
    Second attempt to write record 'YY99993' to /BIC/PYYYY00006 failed
    Message no. RSDMD218
    Diagnosis
    During the master data update, the master data tables are read to determine which records of the data package that was passed have to be inserted, updated, or modified. Some records are inserted in the master data table by a concurrently running request between reading the tables at the start of package processing and the actual record insertion at the end of package processing.
    The master data update tries to overwrite the records inserted by the concurrently running process, but the database record modification returns an unexpected error.
    Procedure
    • Check whether the values of the master data record with the key specified in this message are updated correctly.
    • Run the RSRV master data test "Time Overlaps of Load Requests" and enter the current request to analyze which requests are running concurrently and may have affected the master data update process.
    • Re-schedule the master data load process to avoid such situations in the future.
    • Read SAP Note 668466 to get more information about master data update scheduling.
    Also, the SID table of the master data is empty.
    Thanks for your help!
    Luis

    Dear Daya,
    Thanks for your help, but I had already applied your suggestion. I sent the following details to OSS:
    We are on BI 7.0 (system ID DXX).
    While loading master data for InfoObject XXXX00001 (the main characteristic in our system, like material) we are facing the following error:
    Yellow warning "Second attempt to write record '2.347.263' to /BIC/XXXX00001 was successful"
    We are loading the master data from DataSource ZD_BW_XXXXXXX (from the APO system) through the DTP ZD_BW_XXXXX / XXX130 -> XXXX00001.
    The master data tables (S, P, X) are not updated properly.
    The following repair actions have been taken so far:
    1. Delete all related transactional and master data, checking all relations (tcode SLG1 -> RSDMD, MD_DEL)
    2. Follow the instructions from OSS Note 632931 (tcode RSRV)
    3. Run report RSDMD_CHECKPRG_ALL from tcode SE38 (using both the check and repair options).
    After deleting all data the previous tests were OK, but once we load new master data the same problem appears again, and report RSDMD_CHECKPRG_ALL gives the following error:
    "Characteristic XXXX00001: error found during this test."
    The RSRV check "Compare sizes of P and X and/or Q and Y tables for characteristic XXXX00001" is shown below:
    Characteristic XXXX00001: tables /BIC/PXXXX00001, /BIC/XXXXX00001 are not consistent; 351.196 deviations.
    It seems that our problem is described in OSS Note 1143433 (SP13), even though we are already on SP16.
    Could somebody please help us and let us know how to solve the problem?
    Thanks for everything,
    Luis

  • Data load Error to Planning

    Hi All,
    I have created one interface for loading data to a Planning application. I made the Account dimension the load dimension and Segment the driver dimension (the member is Audio).
    I created one form for this. Now, when I run the interface, it executes successfully, but the report statistics show the following problem:
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (innermost last):
    File "<string>", line 2, in ?
    Planning Writer Load Summary:
         Number of rows successfully processed: 0
         Number of rows rejected: 48
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
         at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:345)
         at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:169)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2374)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java:1615)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java:1580)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java:2755)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java:68)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2515)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:534)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:449)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1954)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:322)
         at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:224)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:246)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:237)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:794)
         at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:114)
         at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
         at java.lang.Thread.run(Thread.java:619)
    I found a log file for it which says:
    Account,Data Load Cube Name,MP3,Point-of-View,Error_Reason
    Max Hits Per Day,Consol,Jan,"USD,Cannot load dimension member, error message is: java.lang.IllegalArgumentException: Unbalanced double quotes found at character 3 in the member list: "USD
    Avg Hits Per Day,Consol,Jan,"USD,Cannot load dimension member, error message is: java.lang.IllegalArgumentException: Unbalanced double quotes found at character 3 in the member list: "USD
    Can anybody tell me what I did wrong? My data file does not contain any double-quote mark with USD; the POV is like this: USD,E01_0 ,Jan,Current,Working,FY11
    Thanks
    Gourav Atalkar
    Edited by: Gourav Atalkar(ELT) on May 3, 2011 4:06 AM

    As it is comma-separated and all the members for the POV need to be together, try enclosing it in quotes, e.g. "USD,E01_0 ,Jan,Current,Working,FY11".
    Cheers
    John
    http://john-goodwin.blogspot.com/
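    To see why the quoting fixes it: in a comma-separated load file an unquoted POV breaks into six separate fields, while a quoted one survives as a single member list. A standalone demo with Python's csv module (the column layout is taken from the error log above; the rest is illustrative):

        import csv, io

        pov = "USD,E01_0,Jan,Current,Working,FY11"
        buf = io.StringIO()
        csv.writer(buf, quoting=csv.QUOTE_MINIMAL).writerow(
            ["Max Hits Per Day", "Consol", pov])
        print(buf.getvalue())
        # Max Hits Per Day,Consol,"USD,E01_0,Jan,Current,Working,FY11"
        # -> the POV stays one field, so no "unbalanced double quotes" error

        row = next(csv.reader(io.StringIO(buf.getvalue())))
        print(row[2])   # USD,E01_0,Jan,Current,Working,FY11  (read back intact)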
