SQL*LOADER ERROR WHILE LOADING ARABIC DATA INTO UNICODE DATABASE

Hi,
I am trying to load Arabic data with SQL*Loader from a .CSV datafile, but I am getting the error "value too large for column" and some rows are not loaded because of it. My target database character set is:
Database character set: AL32UTF8
National character set: AL16UTF16
DB version: 10g Release 2
OS: CentOS 5.0 / Red Hat Linux 5.0
I have specified the character sets AR8MSWIN1256, AR8ISO8859P6, AL32UTF8 and UTF8, one at a time, in the SQL*Loader control file, but I get the same error in every case.
I have also created the table with CHAR semantics and specified "LENGTH SEMANTICS CHAR" in the SQL*Loader control file, but the same error still occurs.
I have also changed the NLS_LANG setting.
What puzzles me is that the data I am trying to load already resides in this very database. But when I generate a CSV from that data and load it with SQL*Loader back into the same database and the same table structure, I get this "value too large for column" error.
What is the underlying problem? Is the datafile itself at fault, since I generate the CSV programmatically, or is there a problem in my approach to loading Unicode data?
Please help...

Here's what we know from what you've posted:
1. You may be running on an unsupported operating system ... likely not the issue but who knows.
2. You are using some patch level of 10gR2 of the Oracle database but we don't know which one.
3. You've had some kind of error, but we do not know the exact error number or the full message text that was displayed with it.
4. You are loading data into a table but we do not have any DDL so we do not know the data types.
Perhaps you could provide a bit more information.
Perhaps a lot more. <g>
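For what it's worth, here is a minimal sketch of the usual setup for this kind of load: a table declared with character-length semantics and a matching control file. The table, column and file names below are made up for illustration, and the CHARACTERSET clause must name the encoding the CSV file was actually written in (not necessarily the database character set):
<code>
-- Target table declared with character semantics, so a 100-character Arabic
-- value still fits even though it needs more than 100 bytes in AL32UTF8
CREATE TABLE arabic_names (
  id       NUMBER,
  name_ar  VARCHAR2(100 CHAR),
  descr_ar VARCHAR2(400 CHAR)
);

-- Matching SQL*Loader control file
LOAD DATA
CHARACTERSET AL32UTF8
LENGTH SEMANTICS CHAR
INFILE 'arabic_names.csv'
APPEND
INTO TABLE arabic_names
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  id,
  name_ar  CHAR(100),
  descr_ar CHAR(400)
)
</code>
If the error persists even with character semantics on both the table and the control file, the CSV generation step is worth checking, in particular whether the file is really written in the character set the control file declares.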

Similar Messages

  • Map Loader error while loading catalogue

    XM 5800, latest firmware, Maps 3 installed and working OK. I get "Error while loading catalogue for Nokia 5800 XpressMusic" whenever I start Map Loader 3.
    Uninstalled/reinstalled Map Loader 3 several times.
    Tried suggested fixes of deleting qf and cities folder then running Maps to recreate file structure.
    Tried it on a Windows 7 PC with PC Suite and also a Vista PC with Ovi Suite. Same error message.
    I am left without the ability to download any maps except over the air, incurring data charges as I go.
    So I may as well just use Google maps!
    I see in the discussion groups this issue comes up for many different Nokia phones occasionally but I have never seen a solution.
    Anyone?

    Just when I thought Nokia Care technical support knew what they were doing!!!!
    About 4pm NZ time yesterday, after the Map Loader server had been working for about 3 to 4 hours and everything was fine, I got a call from Nokia tech support. I was on the other line so they left a message:
    "Sorry, it's not going to work because Map Loader is not completely compatible with Windows 7."
    And my email support thread finally pointed me to a "Solution" discussion board entry detailing a fix for Maps 2: you had to download the latest version of Map Loader 2.02!!!!! That was really helpful considering we are on 3.03 and 3.0.28 respectively.
    So it all starting to work again may have just been coincidence and nothing to do with our support calls! :-<
    Oh well, at least it's working. Now all I have to fix is the MTP USB Device driver failing.
    Cheers
    Brian

  • Error while writing the data into the file. Can you please help with this?

    I am getting the following error while writing the data into the file.
    <bindingFault xmlns="http://schemas.oracle.com/bpel/extension">
    <part name="code">
    <code>null</code>
    </part>
    <part name="summary">
    <summary>file:/C:/oracle/OraBPELPM_1/integration/orabpel/domains/default/tmp/
    .bpel_MainDispatchProcess_1.0.jar/IntermediateOutputFile.wsdl
    [ Write_ptt::Write(Root-Element) ] - WSIF JCA Execute of operation
    'Write' failed due to: Error in opening
    file for writing. Cannot open file:
    C:\oracle\OraBPELPM_1\integration\jdev\jdev\mywork\
    BPEL_Import_with_Dynamic_Transformation\WORKDIRS\SampleImportProcess1\input for writing. ;
    nested exception is: ORABPEL-11058 Error in opening file for writing.
    Cannot open file: C:\oracle\OraBPELPM_1\integration\jdev\jdev\mywork\
    BPEL_Import_with_Dynamic_Transformation
    \WORKDIRS\SampleImportProcess1\input for writing. Please ensure 1.
    Specified output Dir has write permission 2.
    Output filename has not exceeded the max chararters allowed by the
    OS and 3. Local File System has enough space
    .</summary>
    </part>
    <part name="detail">
    <detail>null</detail>
    </part>
    </bindingFault>

    Hi there,
    Have you verified the suggestions in the error message?
    Cannot open file: C:\oracle\OraBPELPM_1\integration\jdev\jdev\mywork\BPEL_Import_with_Dynamic_Transformation\WORKDIRS\SampleImportProcess1\input for writing.
    Please ensure
    1. Specified output Dir has write permission
    2. Output filename has not exceeded the max chararters allowed by the OS and
    3. Local File System has enough space
    I am also curious why you are writing to a directory with the name "..\SampleImportProcess1\input" ?

  • SQL Loader error while loading a date field

    Hi,
    I am getting the below error while I am trying to load a table with a date field using SQL Loader
    Record 1: Rejected - Error on table RPT_HOST_USAGE, column USAGE_TIMESTAMP.
    ORA-01861: literal does not match format string
    My input file is as below
    <code>
    Host_Usage_ID,Host_ID,Technology_ID,Environment_ID,Usage_Timestamp,Avg_CPU_Pct,Avg_Memory_MB,CPU_Spike
    1,12,1,8,'2009-08-01 00:00:00',0.000000000,23875.000000000,0.000000000
    </code>
    My Loader.ctl is
    <code>
    OPTIONS (SKIP=1)
    load data
    infile 'C:\rpt_Host_Usage.txt'
    into table RPT_HOST_USAGE
    fields terminated by ","
    HOST_USAGE_ID,
    HOST_ID,
    TECHNOLOGY_ID,
    ENVIRONMENT_ID,
    USAGE_TIMESTAMP,
    AVG_CPU_PCT,
    AVG_MEMORY_MB,
    CPU_SPIKE
    </code>
    I have tried options like USAGE_TIMESTAMP TO_DATE(USAGE_TIMESTAMP,'YYYY-MM-DD HH24:MI:SS'), but it didn't work...
    Can you please tell me how to resolve this issue?
    Any pointers on this will be helpful
    Thanks
    Mahesh

    I went back and looked at some of my old *.ctl files. I did something similar to what you mentioned and it worked for me, but I had surrounded the expression with double quotes and included a colon in front of the column name. Example:
    TECHNOLOGY_ID,
    ENVIRONMENT_ID,
    USAGE_TIMESTAMP "to_date(:USAGE_TIMESTAMP,'YYYY-MM-DD HH24:MI:SS')",
    AVG_CPU_PCT,
    AVG_MEMORY_MB,
    ....
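    Not from the original thread: putting that suggestion together, the complete control file would look roughly like this. Note that the field list has to be wrapped in parentheses (possibly lost in the original post's formatting), and the OPTIONALLY ENCLOSED BY clause is an extra assumption here to strip the single quotes around the timestamp value in the sample row:
    <code>
    OPTIONS (SKIP=1)
    LOAD DATA
    INFILE 'C:\rpt_Host_Usage.txt'
    INTO TABLE RPT_HOST_USAGE
    FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY "'"
    (
      HOST_USAGE_ID,
      HOST_ID,
      TECHNOLOGY_ID,
      ENVIRONMENT_ID,
      USAGE_TIMESTAMP "to_date(:USAGE_TIMESTAMP, 'YYYY-MM-DD HH24:MI:SS')",
      AVG_CPU_PCT,
      AVG_MEMORY_MB,
      CPU_SPIKE
    )
    </code>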

  • Data load error while loading profit centers in BCS

    Hi,
    I am getting an error message while loading profit center master data. The message says that a total of 6 records were read from the file and that those 6 records contain parameter settings that exceed the user limitations for executing the data collection task.
    I just want to load profit centers into BCS. I have created a master data upload method, where I have specified the field layout as below:
    Header
    Controlling area
    Fiscal Year
    Posting Period
    Data Row
    Profit Center
    When I execute the flexible upload with this method, I get an error message saying the 6 records were ignored.
    My flat file is in this format:
    Controlling area  fiscal year   posting period
    2000             2008          8
    50030
    50035
    50040
    50045
    50050
    50055          
    Please can someone advise me how to load this into BCS.
    It is an urgent requirement and I appreciate your solutions.

    Thanks a lot, Dan and Eugene.
    I was following Eugene's blog for flexible upload of master data and set everything up the way specified in the document.
    I created a method to upload profit centers and tried to load the data, which is where I get the error.
    I am loading attributes, but not the hierarchy. I do not understand why I am not able to load the profit center master data from the flat file into BCS.
    The error is "                                                                               
    7 of 7 data rows were ignored                                                                               
    Message no. UCF7017                                                                               
    Diagnosis                                                                               
    A total of 7 data records were read from file C:\Documents and            
         Settings\Laxmikant.dube\Desktop\P. Of those, 7 records contain parameter  
         settings that exceed the user limitations for executing the data          
         collection task.                                                                               
    System response                                                                               
    The ignored records were not checked any further and will not be          
         written.                                                                               
    Procedure                                                                               
    Make sure that the ignored records are irrelevant for the current task.   
         If some of the ignored records were supposed to be written, then the      
      file probably contains an error.                                                                               
    In this case, closely examine the settings for the fields Version,       
      Fiscal Year, Period, Group Currency, and Consolidation Units in the      
      file.                                                                               
    If you have more than one characteristic with the role consolidation     
      unit (a socalled "matrix organization"), also check whether the settings 
      in the ignored records refer to valid combinations of consolidation      
      units.                                                                               
    Character string "2000,2008,8" is cut off after 4 characters                                                                               
    Message no. UCF7003                                                                               
    Diagnosis                                                                               
    The current data row contains for characteristic Controlling area the     
         value 2000,2008,8. But the maximum length for values of this              
         characteristic is only 4 characters.                                                                               
    System response                                                                               
    The character string "2000,2008,8" is cut off after 4 characters.                                                                               
    Procedure                                                                               
    Check if the string "2000,2008,8" is indeed supposed to be interpreted    
         as the value of characteristic Controlling area. If this is the case,     
         avoid exceeding the maximum length for this characteristic.                                                                               
    However, if there has been a mix-up, compare the structure of the file 
    with the definition in the upload method. If variable column widths are
    used, particularly pay attention to the number of field separators in  
    the data row involved.                                                 
    Please advise me on this. It would be a great help.

  • Error while inserting attachment data into FND_LOBS

    Hi,
    I am trying to insert data into FND_LOBS using the statement below, but I have not declared any variables for it.
    I am getting many compilation errors for this insert. Could you let me know which of the values below I have to declare?
    INSERT INTO fnd_lobs
               (file_id, file_name, file_content_type, upload_date,expiration_date, program_name, program_tag, file_data,LANGUAGE, oracle_charset, file_format)
        VALUES (l_media_id, l_filename,p_file_content_type,SYSDATE,NULL, 'FNDATTCH', NULL, EMPTY_BLOB (),'US', 'UTF8', 'binary')
               RETURNING file_data
               INTO x_blob;
    -- Load the file into the database as a BLOB
        DBMS_LOB.OPEN (fils, DBMS_LOB.lob_readonly);
        DBMS_LOB.OPEN (x_blob, DBMS_LOB.lob_readwrite);
        DBMS_LOB.loadfromfile (x_blob, fils, blob_length);   -- Close handles to blob and file
        DBMS_LOB.CLOSE (x_blob);
        DBMS_LOB.CLOSE (fils);
    Thanks.
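    Not part of the original post: a minimal, self-contained sketch of the declarations the INSERT above seems to assume. The file name, directory object and content type are made-up example values, and FND_LOBS_S is assumed to be the seeded sequence behind FILE_ID:
    <code>
    DECLARE
       l_media_id           NUMBER;
       l_filename           fnd_lobs.file_name%TYPE         := 'example.pdf';      -- hypothetical file name
       p_file_content_type  fnd_lobs.file_content_type%TYPE := 'application/pdf';  -- hypothetical content type
       x_blob               BLOB;
       fils                 BFILE := BFILENAME('MY_ATTACH_DIR', 'example.pdf');    -- hypothetical directory object
       blob_length          INTEGER;
    BEGIN
       -- assumption: FND_LOBS_S is the sequence that feeds FND_LOBS.FILE_ID
       SELECT fnd_lobs_s.NEXTVAL INTO l_media_id FROM dual;
       INSERT INTO fnd_lobs
              (file_id, file_name, file_content_type, upload_date, expiration_date,
               program_name, program_tag, file_data, language, oracle_charset, file_format)
       VALUES (l_media_id, l_filename, p_file_content_type, SYSDATE, NULL,
               'FNDATTCH', NULL, EMPTY_BLOB(), 'US', 'UTF8', 'binary')
       RETURNING file_data INTO x_blob;
       -- load the OS file into the BLOB locator returned by the INSERT
       DBMS_LOB.OPEN (fils, DBMS_LOB.lob_readonly);
       blob_length := DBMS_LOB.GETLENGTH (fils);
       DBMS_LOB.OPEN (x_blob, DBMS_LOB.lob_readwrite);
       DBMS_LOB.LOADFROMFILE (x_blob, fils, blob_length);
       DBMS_LOB.CLOSE (x_blob);
       DBMS_LOB.CLOSE (fils);
       COMMIT;
    END;
    /
    </code>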


  • Error while activating the data into DSO

    Hi
    My base DSO is used to load 4 other data targets.
    In the process chain, after the base DSO gets activated there are 4 DTPs running to load the data from the base DSO to another DSO and 3 cubes.
    When loading to the other DSO, we encountered an error:
    Object is currently locked by BI Remote
    Lock not set for: Activating data in DSO
    Activation of M records terminated.
    1. My question is: when loading data from the base DSO to other objects, how does the lock mechanism work?
    I know that we cannot load data into the base DSO while the base DSO is sending data to a target.
    2. What difference does it make when loading from DSO to DSO and to a cube in parallel?
    Thanks
    Annie

    Hi Annie.....
    1. My question is: when loading data from the base DSO to other objects, how does the lock mechanism work?
    I know that we cannot load data into the base DSO while the base DSO is sending data to a target.
    Do you mean to say that the load into the 2nd-level DSO was successful, but the activation failed?
    Have you checked in SM12 whether that 2nd-level DSO is somehow locked?
    Are any further targets being loaded from this 2nd-level DSO?
    Suppose you are loading a DSO A, and in the meantime a load starts from DSO A to some other target (it may be a DSO or a cube). Then the activation in DSO A will fail: since the last request in DSO A is not activated, that request will not be considered in the subsequent load, and because the load is already in progress the system will not allow any new request to be activated.
    Another possibility is that DSO A is also being loaded from other sources; while such a load is still in progress for this target, activation is not allowed.
    So check this and start the activation again.
    2. What difference does it make when loading from DSO to DSO and to a cube in parallel?
    The main difference is that there is no activation concept in a cube, so a cube may be loaded from several sources in parallel.
    A DSO can also be loaded in parallel, but activation should start only once all the loads have completed successfully.
    Regards,
    Debjani....

  • Data Load Error while Loading data in PSA

    Hi All,
    We have BI 7.0. While we are loading data from the source system (SAP R/3) into the PSA in BW, it gives an error.
    The error is "Error occurred in the source system" and the request turns RED.
    However, we are able to extract the data correctly in SAP R/3 through RSA3.
    Please suggest.
    Mayank

    Hi,
    I will suggest you to check a few places where you can see the status
    1) SM37 job log (in the source system if the load is from R/3, or in BW if it is a datamart load): give the request name and it should show you the details about the request. If it is active, make sure that the job log is being updated at frequent intervals.
    Also see if there is any 'sysfail' for any data packet in SM37.
    2) SM66: get the job details (server name, PID etc. from SM37) and see in SM66 whether the job is running or not (in the source system if the load is from R/3, or in BW if it is a datamart load). See whether it is accessing/updating some tables or not doing anything at all.
    3) RSMO: see what is available in the Details tab. It may be stuck in the update rules.
    4) ST22: check whether any short dump has occurred (in the source system if the load is from R/3, or in BW if it is a datamart load).
    5) Check SM58 and BD87 for pending tRFCs and IDocs.
    Once you identify the cause you can rectify the error.
    If all the records are in the PSA you can pull them from the PSA into the target. Otherwise you may have to pull them again from the source InfoProvider.
    If it is running and you can see it active in SM66, you can wait for some time to let it finish. You can also try SM50 / SM51 to see what is happening at the system level, such as reading/inserting into tables.
    If you feel it is active and running, you can verify this by checking whether the number of records in the data tables has increased.
    SM21 - System log can also be helpful.
    Thanks,
    JituK

  • Error while uploading data into a Z table with background processing

    Hi gurus,
    I am trying to upload data from an Excel file into an internal table and it works fine in dialog mode.
    But if I try to upload the data with background processing, SM37 shows the error "error during the upload of clipboard contents".
    Regards,
    Sri

    Hi,
    FM GUI_UPLOAD does not work in background jobs; read the file from the application server with OPEN DATASET instead.
    Refer to the code below:
    *--Hypothetical declarations (not in the original reply); adjust the
    *--component types to match your Z table and the file layout
      TYPES : BEGIN OF ty_ipfile,
                vbeln TYPE c LENGTH 10,
                posnr TYPE c LENGTH 6,
                edatu TYPE c LENGTH 10,
                wmeng TYPE c LENGTH 17,
              END OF ty_ipfile.
      DATA : st_ipfile TYPE ty_ipfile,
             it_ipfile TYPE STANDARD TABLE OF ty_ipfile.
      PARAMETERS : p_ipfile TYPE rlgrap-filename.   "application server file path
    *--Local variables
      DATA : l_file  TYPE string,
             l_line  TYPE string.
    *--Clear
      CLEAR : l_file.
      l_file = p_ipfile.
    *--Open the file on the application server for reading
      OPEN DATASET l_file FOR INPUT IN TEXT MODE ENCODING DEFAULT.
      IF sy-subrc NE 0.
    *--Error in opening file
        MESSAGE i368(00) WITH text-005.
      ENDIF.
    *--Read all the records from the specified location
      DO.
        READ DATASET l_file INTO l_line.
        IF sy-subrc NE 0.
          EXIT.
        ELSE.
          SPLIT l_line AT cl_abap_char_utilities=>horizontal_tab
                          INTO st_ipfile-vbeln
                               st_ipfile-posnr
                               st_ipfile-edatu
                               st_ipfile-wmeng.
          APPEND st_ipfile TO it_ipfile.
        ENDIF.
      ENDDO.
    *--Close the dataset (missing in the original snippet)
      CLOSE DATASET l_file.
    Regards,
    Prashant

  • Error while inserting xml data into table

    Hello
    I am running this stored procedure and it compiles correctly, but when I try to execute it I get an error. I have been trying this for the past 3 days; could anyone please help me ASAP?
    SQL> CREATE OR REPLACE PROCEDURE loadxml12 AS
    2 fil BFILE;
    3 buffer RAW(32767);
    4 len INTEGER;
    5 insrow INTEGER;
    6 BEGIN
    7 SELECT f_lob INTO fil FROM xml_temp12 WHERE key = 1;
    8 DBMS_LOB.FILEOPEN(fil,DBMS_LOB.FILE_READONLY);
    9 len := DBMS_LOB.GETLENGTH(fil);
    10 DBMS_LOB.READ(fil,len,1,buffer);
    11 xmlgen.resetOptions;
    12 insrow := xmlgen.insertXML('xml_doc',UTL_RAW.CAST_TO_VARCHAR2(buffer));
    13 DBMS_OUTPUT.PUT_LINE(insrow);
    14 IF DBMS_LOB.FILEISOPEN(fil) = 1 THEN
    15 DBMS_LOB.FILECLOSE(fil);
    16 END IF;
    17 EXCEPTION
    18 WHEN OTHERS THEN
    19 DBMS_OUTPUT.PUT_LINE('In Exception');
    20 DBMS_OUTPUT.PUT_LINE(SQLERRM(SQLCODE));
    21 IF DBMS_LOB.FILEISOPEN(fil) = 1 THEN
    22 DBMS_LOB.FILECLOSE(fil);
    23 END IF;
    24 end;
    25 /
    Procedure created.
    SQL> exec loadxml12
    In Exception
    BEGIN loadxml12; END;
    ERROR at line 1:
    ORA-20000: ORU-10028: line length overflow, limit of 255 bytes per line
    ORA-06512: at "SYS.DBMS_OUTPUT", line 99
    ORA-06512: at "SYS.DBMS_OUTPUT", line 65
    ORA-06512: at "CARCLUB_RW2.LOADXML12", line 20
    ORA-06512: at line 1

    Could you explain what your procedure does, please? I also tried to compile it but always got this error message:
    PL/SQL: SQL Statement ignored
    PLS-00385: type mismatch found at 'FIL' in SELECT...INTO
    statement
    SQL> CREATE OR REPLACE PROCEDURE loadxml12 AS
    2 fil BFILE;
    3 buffer RAW(32767);
    4 len INTEGER;
    5 insrow INTEGER;
    6 BEGIN
    7 SELECT f_lob INTO fil FROM xml_temp12 WHERE key = 1;
    8 DBMS_LOB.FILEOPEN(fil,DBMS_LOB.FILE_READONLY);
    9 len := DBMS_LOB.GETLENGTH(fil);
    10 DBMS_LOB.READ(fil,len,1,buffer);
    11 xmlgen.resetOptions;
    12 insrow := xmlgen.insertXML('xml_doc',UTL_RAW.CAST_TO_VARCHAR2(buffer));
    13 DBMS_OUTPUT.PUT_LINE(insrow);
    14 IF DBMS_LOB.FILEISOPEN(fil) = 1 THEN
    15 DBMS_LOB.FILECLOSE(fil);
    16 END IF;
    17 EXCEPTION
    18 WHEN OTHERS THEN
    19 DBMS_OUTPUT.PUT_LINE('In Exception');
    20 DBMS_OUTPUT.PUT_LINE(SQLERRM(SQLCODE));
    21 IF DBMS_LOB.FILEISOPEN(fil) = 1 THEN
    22 DBMS_LOB.FILECLOSE(fil);
    23 END IF;
    24 end;
    25 /
    Bober
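    A side note on the runtime error shown above: the ORA-20000/ORU-10028 is raised by the DBMS_OUTPUT.PUT_LINE call inside the exception handler (line 20), so it masks whatever originally failed. A minimal sketch, not from the thread, of printing a long error text in chunks so the handler itself stays under the old 255-byte-per-line DBMS_OUTPUT limit:
    <code>
    SET SERVEROUTPUT ON
    DECLARE
      l_msg VARCHAR2(4000);
    BEGIN
      -- force a long error message purely for demonstration
      RAISE_APPLICATION_ERROR(-20001, RPAD('x', 600, 'x'));
    EXCEPTION
      WHEN OTHERS THEN
        l_msg := SQLERRM;
        -- print in 200-byte chunks so each output line stays under 255 bytes
        FOR i IN 0 .. TRUNC((LENGTH(l_msg) - 1) / 200) LOOP
          DBMS_OUTPUT.PUT_LINE(SUBSTR(l_msg, i * 200 + 1, 200));
        END LOOP;
    END;
    /
    </code>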

  • Caller-70 Error while loading master data into infoobject

    Hi,
    I am getting the following error while loading master data into an InfoObject (0tb-account). I am loading this master data in the production environment for the first time. There are about 300,000 records, and all of them have loaded up to the PSA. The InfoPackage setting was PSA and then into the data target.
    Short dump in the Warehouse
    Diagnosis
    The data update was not finished. A short dump has probably been logged in BI. This provides information about the error.
    System Response
    "Caller 70" is missing.
    ST22 dump analysis is as below:
    Termination occurred in the ABAP program "GP476CZYBEF2WX53UZ8TXFG6XOS" - in                  
    "VALUE_TO_SID_CONVERT_DB".                                                                  
    The main program was "RSMO1_RSM2 ".
    Please help as soon as you can; this is a production problem.
    Regards
    Rakesh

    Hi Rakesh,
    Maybe the IDocs were not processed completely.
    If it is an IDoc problem, either wait until the timeout and process the IDoc from the detail monitor screen, or go to BD87 and process the IDocs with status YELLOW (be careful while processing IDocs from BD87; choose only the relevant IDocs).
    Cheers
    Raj

  • Error while loading data into Customers InfoCube

    Hi All,
    I am receiving the following error while loading data into InfoCube 0SD_C01.
    The error I am receiving is "LIS Updating is Still Active".
    I am also receiving an error similar to "The data is being edited at the time of loading the data request".
    Has anybody encountered such a situation?
    Please help me out.
    Regards
    YJ

    Hi Dinesh and Paolo,
    The links are very useful.
    I went through the transactions and found out that asynchronous updating is going on for S001.
    So what steps can I take as a BW consultant so that the data is loaded into BW?
    Regards
    YJ

  • Error while loading data into cube 0calday to 0fiscper (2lis_13_vdcon)

    Hi all,
    I am getting the following error while loading data into the cube:
    "Time conversion from 0CALDAY to 0FISCPER (fiscal year V3 ) failed with value 10081031"
    amit shetye

    Hi Amit,
    This is a conversion problem: the calendar is not maintained for fiscal year variant "V3" for year 1008.
    Maintain the calendar for year 1008 and transfer the global settings from the source (R/3):
    RSA1 --> Source systems --> context menu --> Transfer global settings --> choose fiscal year variants and calendar --> execute
    Hope it helps
    Srini

  • Runtime error while loading data into infocube

    Hi all,
    I am getting a runtime error while loading the transaction data into the InfoCube.
    Exception name: CX_SY_GENERATE_SUBPOOL_FULL
    Runtime error: GENERATE_SUBPOOL_DIR_FULL
    Does anyone have any idea about this error? Any help will be really appreciated. Thanks in advance.
    Regards,
    Sathesh

    Hi Sathesh,
    Please provide more details about the problem.
    Check in ST22, go through the dump analysis and let us know.
    Regards
    Hemant Khemani

  • Error while loading data from DS into cube

    Hello All
    I am getting the below error while loading data from the DataSource to the cube:
    Record 1: Time conversion from 0CALDAY to 0FISCPER (fiscal year ) failed with value 20070331
    What am I supposed to do?
    I need your input in this regard.
    Regards
    Rohit

    Hi
    Simply map 0CALDAY to 0FISCPER (you have already done that). You might have forgotten to map the fiscal year variant in the update rules: if it comes from the source, just map it; if it does not come from the source, set it as a constant in the update rules and give it a value.
    If your business year runs April to March, make it 'V3'.
    If your business year runs January to December, make it 'K4'.
    Activate your update rules, delete the old data and upload again.
    Hope it helps
    Thanks
    Teja
