JCS: how to load users into Identity Console from a CSV file

In my Java Cloud Service, when I go to the Identity Console there is a "Load users" function under "Manage Users".
What should this CSV file look like?
How can I add roles to the users?
I uploaded a file yesterday and I still get the message "Maximum number of simultaneous uploads per identity domain exceeded. No more uploads will be accepted at this time. Select UTF-8 encoded CSV file you wish to upload. Maximum file size is 256 KB."
When will this file be processed? Can I cancel the upload? How do I get notified about the result?
kind regards
Robert

See http://docs.oracle.com/cloud/131/trial_paid_subscriptions/CSGSG/cloud-manage-user-accounts.htm#BCFDAIJA ("Adding a Batch of User Accounts").
The file has to have a header row:
First Name,Last Name,Email,User Login
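For example (the names, e-mail addresses and logins below are made up; as far as I can tell roles are not part of this upload and are assigned separately in the console once the accounts exist):
Robert,Example,robert.example@example.com,robert.example@example.com
Jane,Sample,jane.sample@example.com,jane.sample@example.com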

Similar Messages

  • Loading data from .csv file into existing table

    Hi,
    I have taken a look at several threads which talk about loading data from a .csv file into an existing/new table. I have also checked out Vikas's application regarding the same. Let me explain my requirement with an example.
    I have a .csv file and I want the data to be loaded into an existing table. The timesheet table columns are -
    timesheet_entry_id,time_worked,timesheet_date,project_key .
    The csv columns are :
    project,utilization,project_key,timesheet_category,employee,timesheet_date , hours_worked etc.
    What I need to know is whether, before the csv data is loaded into the timesheet table, there is any way of validating the project_key (which is the primary key of the projects table) against the projects table. I need to perform similar validations with other columns, like customer_id against the customers table. Basically the load should only be done after validating that the data exists in the parent table. Has anyone done this kind of loading through the APEX data-load utility? Or is there another method of accomplishing the same?
    Does Vikas's application do what the utility does (I am assuming that, the code being from 2005, the utility was not incorporated in APEX at that time)? Any helpful advice is greatly appreciated.
    Thanks,
    Anjali

    Hi Anjali,
    Take a look at these threads which might outline different ways to do it -
    File Browse, File Upload
    Loading CSV file using external table
    Loading a CSV file into a table
    You can also create hidden items on the page to validate the records before inserting the data (see the sketch at the end of this thread).
    Hope this helps,
    M Tajuddin
    http://tajuddin.whitepagesbd.com
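    As a sketch of the parent-table validation discussed above, assuming the csv rows have already been parsed into a staging table (timesheet_stage, timesheet_stage_errors and timesheet_seq are hypothetical names used only for illustration):
    -- insert only the staged rows whose project_key exists in PROJECTS,
    -- and log the rejected rows for review
    begin
      insert into timesheet (timesheet_entry_id, time_worked, timesheet_date, project_key)
      select timesheet_seq.nextval, s.hours_worked, s.timesheet_date, s.project_key
        from timesheet_stage s
       where exists (select 1 from projects p where p.project_key = s.project_key);
      insert into timesheet_stage_errors (project_key, timesheet_date, error_text)
      select s.project_key, s.timesheet_date, 'project_key not found in projects'
        from timesheet_stage s
       where not exists (select 1 from projects p where p.project_key = s.project_key);
      commit;
    end;
    /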

  • How do I call two csv files into EWS?

    I have to load two csv files in a script that uses EWS:
    Param([string]$calInputFile = "C:\Algemene agenda\Data docenten.txt" )
    Param([string]$calMailboxFile ="C:\Algemene agenda\Alle docenten.txt"
    # Load all Entries
    $calagenda = Import-Csv $calInputFile
    $calMailbox = Import-Csv $callMailboxfile
    This does not work.
    How can I do this?

    OH!!  Why didn't I see this before?  I thought you were having trouble pulling the data into the script, but the real problem is the Param definition.  You can only have a single Param block in a script.  Try the following instead:
    Param(
        [string]$calInputFile = "C:\Algemene agenda\Data docenten.txt",
        [string]$calMailboxFile = "C:\Algemene agenda\Alle docenten.txt"
    )
    # Load all entries
    $calagenda = Import-Csv $calInputFile
    $calMailbox = Import-Csv $calMailboxFile
    To define multiple parameters, you have your Param definition (with its parentheses) and all parameters are defined inside it, separated by commas.

  • Loading data from .csv file into Oracle Table

    Hi,
    I have a requirement where I need to populate data from a .csv file into an Oracle table.
    Is there a mechanism I can use for this?
    Any help would be appreciated.
    Thanks and regards

    You can use SQL*Loader or external tables for this requirement (a minimal external-table sketch follows below).
    Missed Karthick's post ... already there :)
    Edited by: Rajneesh Kumar on Dec 4, 2008 10:54 AM
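    A sketch of the external-table approach, assuming the csv has been placed in a server directory exposed as an Oracle directory object (the directory path, file, table and column names are only illustrative; adjust datatypes to your file):
    create or replace directory csv_dir as '/data/csv';   -- path on the database server
    create table emp_csv_ext (
      empno    number,
      ename    varchar2(50),
      hiredate date
    )
    organization external (
      type oracle_loader
      default directory csv_dir
      access parameters (
        records delimited by newline
        skip 1                                            -- skip the header row
        fields terminated by ',' optionally enclosed by '"'
        (empno, ename, hiredate char date_format date mask "YYYY-MM-DD")
      )
      location ('emp.csv')
    )
    reject limit unlimited;
    -- then the load into the target table is plain SQL
    insert into emp (empno, ename, hiredate)
    select empno, ename, hiredate from emp_csv_ext;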

  • How to upload csv files with more than 20,000 records into an Oracle table (Oracle APEX 3.2)

    Can anyone help me upload a CSV with more than 20,000 records using the APEX 3.2 upload process? I am using the regular upload process with the BLOB file:
    SELECT blob_content,id,filename into v_blob_data,v_file_id,v_file_name
    FROM apex_application_files
    WHERE last_updated = (select max(last_updated)
    from apex_application_files WHERE UPDATED_BY = :APP_USER)
    AND id = (select max(id) from apex_application_files where updated_by = :APP_USER);
    I tried to upload, but my page times out. The application works fine for up to 1,000 records; after that it times out. Each record takes 2 seconds to store in the Oracle table, so 1,000 records take 7 minutes, and then the APEX upload page times out.
    Please help me with source code showing how to speed up the CSV upload process, or point me to a better approach with a source example.
    Thanks,
    Sant.
    Edited by: 994152 on Mar 15, 2013 5:38 AM

    See this posting:
    Internet Explorer Cannot Display
    There, I provided a couple of links on this particular issue. You need to change the timeout on your application server.
    Denes Kubicek
    http://deneskubicek.blogspot.com/
    http://www.apress.com/9781430235125
    http://apex.oracle.com/pls/apex/f?p=31517:1
    http://www.amazon.de/Oracle-APEX-XE-Praxis/dp/3826655494
    -------------------------------------------------------------------
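    For reference, if the APEX instance is served through Oracle HTTP Server / Apache with mod_plsql (an assumption; other stacks have their own settings), one commonly adjusted setting is the Apache Timeout directive in httpd.conf, for example:
    # httpd.conf - allow long-running requests (value in seconds)
    Timeout 600
    Raising the timeout only buys time, though; speeding up the row-by-row insert itself is usually the more durable fix.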

  • How to import data from a CSV file into a table by using Oracle Forms

    Hi,
    I have a CSV file and I want to insert its data into a table in an Oracle database by using a button in Oracle Forms.
    The user can select the CSV file using an open dialog.
    Can anyone help me find a method to do the import and select the file from the client machine?
    Thanks.

    1. Create a table with a BLOB column:
    CREATE TABLE ib_love (
      doc          BLOB,
      contract_no  VARCHAR2(20 BYTE) NOT NULL
    );
    2. Use the code below to insert:
       INSERT INTO ordmgmt.ib_love
                   (contract_no, doc)
            VALUES (:control.contract_no_input, NULL);
       lb$result :=
         webutil_file_transfer.client_to_db (:b2.file_name, v_file_blob_name, v_col_blob_name,
                                             'CONTRACT_NO = ' || :control.contract_no_input);
       :SYSTEM.message_level := 25;
       COMMIT;
       :SYSTEM.message_level := 0;
    3. Use the code below to download:
       IF :control.contract_no_input IS NOT NULL THEN
          vboolean := webutil_file_transfer.db_to_client_with_progress(
                          vfilename,                                      -- file name
                          'IB_LOVE',                                      -- table holding the BLOB
                          'DOC',                                          -- BLOB column name
                          'CONTRACT_NO = ' || :control.contract_no_input, -- where clause to retrieve the record
                          'Downloading from Database',                    -- progress bar title
                          'Wait to Complete');                            -- progress bar subtitle
          client_host('cmd /c start ' || vfilename);
       ELSE
          errmsg('Please choose contract no');
       END IF;
    4. Use the code below to open the file dialog:
       x := WEBUTIL_FILE.FILE_OPEN_DIALOG('c:\', '*.gif', '|*.gif|*.gif|', 'My Open Window');
       :b2.file_name := x;

  • How do I import a CSV file into numbers so the columns show

    Thanks for the help.  I have a CSV file generated by the Google Keyword Tool; when I open it in Numbers, all the values of each row go into a single cell: I get the correct number of rows but only one column, with the five values in each one.  I read on here to check whether the extension is correct, and it is csv.  I've also dragged the file onto the Numbers icon in the dock and get the same result.  Google gives you the option of exporting as a tsv file; I did that and got the exact same result.  I've looked at the CSV file in TextEdit and it doesn't show any commas, but it seems to have tabs.  But the tsv file doesn't open correctly either.  I'm at a loss.

    If the file is a TSV one, its name extension must be .txt.
    Numbers opens such files flawlessly.
    Yvan KOENIG (VALLAURIS, France), Tuesday 4 October 2011 11:53:27
    iMac 21”5, i7, 2.8 GHz, 4 Gbytes, 1 Tbytes, mac OS X 10.6.8 and 10.7.0
    My iDisk is : <http://public.me.com/koenigyvan>
    Please : Search for questions similar to your own before submitting them to the community

  • Does anybody know how Oracle loads a large N-Triple file into Oracle 11g R1?

    Does anybody know how Oracle loaded a large N-Triple (NT) file into Oracle 11g R1 using SQL*Loader for their benchmark results?
    Their benchmark results indicate they have overcome the large-data-set problem.
    http://www.oracle.com/technology/tech/semantic_technologies/htdocs/performance.html
    It means they have loaded LUBM 8000 (1.068 billion+ triples) into Oracle successfully, but no detailed steps are provided. For instance, did they use a 32-bit or 64-bit platform, and was only one NT file used per dataset, or several NT files?
    Does any exception occur during the loading process if the NT file is beyond 60 GB? When using Jena to generate an NT file for LUBM(8000), the size of the NT file will definitely be beyond 60 GB.
    We are considering dividing such a large NT file into several smaller ones. Is that a good approach? I'm hesitant to do so!

    A Linux 32-bit platform was used for bulk-load of LUBM-8000 1.106 billion (before duplicate elimination) RDF triples into Oracle.
    Multiple gzipped N-Triple files were used to hold the LUBM-8000 data. zcat was used on all these files together to send the complete data into a named pipe. SQL*Loader used this named pipe as the input data file to load the data into a staging table in Oracle. Once the staging table was loaded, the sem_apis.bulk_load_from_staging_table API was used to load the data into Oracle Semantic Store.
    (Additional details in http://www.oracle.com/technology/tech/semantic_technologies/htdocs/performance.html )
    Thanks.
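    A rough sketch of that pipeline, for orientation only (the file names, schema, staging table and control file are illustrative, and the exact parameters of sem_apis.bulk_load_from_staging_table should be checked against the Semantic Technologies documentation):
    # build a named pipe and stream all gzipped N-Triple files into it
    mkfifo /tmp/nt_pipe
    zcat lubm8000_part*.nt.gz > /tmp/nt_pipe &
    # SQL*Loader reads the pipe as its data file and fills a staging table
    sqlldr userid=rdfuser control=stage_triples.ctl data=/tmp/nt_pipe
    -- afterwards, in SQL*Plus, move the staged triples into the semantic store
    EXEC sem_apis.bulk_load_from_staging_table('lubm8000', 'RDFUSER', 'STAGE_TRIPLES');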

  • How to load properties from csv file directly into a .dll parameter in a test step

    Hi,
    I would like to take values from a property file and plug them directly into the parameters used to invoke DLL calls.
    I have looked at examples of how to use the property loader to input test limits, but I cannot figure out how to pipe the values into the params.
    I have tried. <step name>.Module.<param name>
    step.propertieslist[0].
    Any ideas?
    Thanks,
    m

    I agree that it would be nice.  The difference between limits and parameters, though, is that parameters are set dynamically.  What I mean by that is you never know how many parameters a step will have until you define the module, whereas the limits are always set in stone.  The only exception is the Multiple Numeric Limit Test, which is just an array that gets appended to with the same datatype.  And the only access to a parameter's value is through the API, unless it is a variable.
    The other hard part is that you don't know the datatype of the parameters.  Any step can have any number of parameters, with any datatype being passed to each individual parameter.
    I think if you are set on NOT creating Locals or FileGlobals then you would need to create custom step types with subproperties that get assigned to the parameters.  This could be even more painful, though, because now you have to maintain and support the custom step types.
    The other option would be to create your own property loader step that could call in and set things through the API.  Hmmm, now that would be an interesting project...
    jigg
    CTA, CLA
    teststandhelp.com
    ~Will work for kudos and/or BBQ~

  • How to load an AS3 swf file into a AS2 swf file?

    Hi,
    I am trying to display a swf file that uses AS3 in a Flash site which uses AS2. I know that loading the swf file in the normal way, into a dummy movieclip (loadMovie) or into a new level (loadMovieNum), doesn't work; the AS2-based Flash site doesn't understand the new file. Is there another way to get around this problem?
    Many thanks
    92 North Ltd
    Leeds website, graphic and web banner design
    Phone 0113 815 1158
    http://www.92north.com

    Thanks kglad. Will give that a try

  • How to read/write .CSV file into CLOB column in a table of Oracle 10g

    I have a requirement involving a table with two columns:
    create table emp_data (empid number, report clob)
    Here the REPORT column is of CLOB data type and is used to hold the data loaded from the .csv file.
    The requirement is:
    1) How to load data from a .CSV file into the CLOB column, along with the empid, using the DBMS_LOB utility.
    2) How to read the report column so that it returns all the columns present in the .CSV file (dynamically, because every csv file may have a different number of columns) along with the primary key empid.
    eg: empid report_field1 report_field2
    1 x y
    Any help would be appreciated.

    If I understand you right, you want each row in your table to contain an emp_id and the complete text of a multi-record .csv file.
    It's not clear how you relate emp_id to the appropriate file to be read. Is the emp_id stored in the csv file?
    To read the file, you can use functions from [UTL_FILE|http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14258/u_file.htm#BABGGEDF] (as long as the file is in a directory accessible to the Oracle server):
    declare
        lt_report_clob CLOB;
        l_max_line_length integer := 1024;   -- set as high as the longest line in your file
        l_infile UTL_FILE.file_type;
        l_buffer varchar2(1024);
        l_emp_id report_table.emp_id%type := 123; -- not clear where emp_id comes from
        l_filename varchar2(200) := 'my_file_name.csv';   -- get this from somewhere
    begin
       -- open the file; we assume an Oracle directory has already been created
        l_infile := utl_file.fopen('CSV_DIRECTORY', l_filename, 'r', l_max_line_length);
        -- initialise the empty clob
        dbms_lob.createtemporary(lt_report_clob, TRUE, DBMS_LOB.session);
        loop
          begin
             utl_file.get_line(l_infile, l_buffer);
             dbms_lob.append(lt_report_clob, l_buffer);
          exception
             when no_data_found then
                 exit;
          end;
        end loop;
        insert into report_table (emp_id, report)
        values (l_emp_id, lt_report_clob);
        -- free the temporary lob
        dbms_lob.freetemporary(lt_report_clob);
       -- close the file
       UTL_FILE.fclose(l_infile);
    end;
    This simple line-by-line approach is easy to understand, and gives you an opportunity (if you want) to take each line in the file and transform it (for example, you could transform it into a nested table, or into XML). However it can be rather slow if there are many records in the csv file - the lob_append operation is not particularly efficient. I was able to improve the efficiency by caching the lines in a VARCHAR2 up to a maximum cache size, and only then appending to the LOB - see [three posts on my blog|http://preferisco.blogspot.com/search/label/lob].
    There is at least one other possibility:
    - you could use [DBMS_LOB.loadclobfromfile|http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14258/d_lob.htm#i998978]. I've not tried this before myself, but I think the procedure is described [here in the 9i docs|http://download.oracle.com/docs/cd/B10501_01/appdev.920/a96591/adl12bfl.htm#879711]. This is likely to be faster than UTL_FILE (because it is all happening in the underlying DBMS_LOB package, possibly in a native way).
    That's all for now. I haven't yet answered your question on how to report data back out of the CLOB. I would like to know how you associate employees with files; what happens if there is > 1 file per employee, etc.
    HTH
    Regards Nigel
    Edited by: nthomas on Mar 2, 2009 11:22 AM - don't forget to fclose the file...
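    For completeness, a minimal sketch of the DBMS_LOB.loadclobfromfile route mentioned above (it reuses the CSV_DIRECTORY object and report_table from the snippet above, which are themselves assumptions; check the DBMS_LOB documentation for the parameter details):
    declare
        l_clob   clob;
        l_bfile  bfile := bfilename('CSV_DIRECTORY', 'my_file_name.csv');
        l_dest   integer := 1;
        l_src    integer := 1;
        l_lang   integer := dbms_lob.default_lang_ctx;
        l_warn   integer;
    begin
        -- create the row first and grab its CLOB locator
        insert into report_table (emp_id, report)
        values (123, empty_clob())
        returning report into l_clob;
        dbms_lob.fileopen(l_bfile, dbms_lob.file_readonly);
        dbms_lob.loadclobfromfile(
            dest_lob     => l_clob,
            src_bfile    => l_bfile,
            amount       => dbms_lob.lobmaxsize,
            dest_offset  => l_dest,
            src_offset   => l_src,
            bfile_csid   => dbms_lob.default_csid,
            lang_context => l_lang,
            warning      => l_warn);
        dbms_lob.fileclose(l_bfile);
        commit;
    end;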

  • How to unzip a folder and load a bunch of .csv files using ODI

    Hi All
    Below is my scenario.
    We have created an FTP folder that will receive the .zip files (every day).
    Each .zip file contains a bunch of .csv files (flat files), and using ODI we need to extract the .zip file and load all the .csv files into the target database. The source data is the .csv files and the target is Oracle 10g. Will this be possible using ODI? Please suggest.
    Thanks in Advance
    Zakeer

    Hi,
    I'm interested in this way of reading multiple files. I tried to follow those steps but I'm getting an error while running the package:
    ODI-1226: Step vReadFile fails after 1 attempt(s).
    ODI-1236: Error while refreshing variable VOUCHER_CUSTOMER_DATA_INTEGRATION.vReadFile.
    Caused By: java.lang.NumberFormatException: For input string: "#Activity_report_name"
    The variable vReadFile can't recognize the other variable Activity_report_name.
    The refresh query for vReadFile is:
    select     file     C1_FILE
    from      TABLE
    /*$$SNPS_START_KEYSNP$CRDWG_TABLESNP$CRTABLE_NAME=files_namesSNP$CRLOAD_FILE=C:\/files_name.txtSNP$CRFILE_FORMAT=DSNP$CRFILE_SEP_FIELD=0x0009SNP$CRFILE_SEP_LINE=0x000D0x000ASNP$CRFILE_FIRST_ROW=#Activity_report_nameSNP$CRFILE_ENC_FIELD=SNP$CRFILE_DEC_SEP=SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=fileSNP$CRTYPE_NAME=STRINGSNP$CRORDER=1SNP$CRLENGTH=50SNP$CRPRECISION=50SNP$CRACTION_ON_ERROR=0SNP$CR$$SNPS_END_KEY*/
    Any suggestions please?
    Regards

  • How to import old iPhone SMS.CSV file into new iPhone?

    Hey everyone,
    So I've spent a couple of days on this as I just upgraded to the iPhone 5S from my 4S.  Long story short, my backup restore didn't work after multiple tries (it gave me the "file is corrupt or not compatible with the iPhone" error).
    I tried:
    -reinstalling itunes
    -updating itunes
    -combining the 5S backup with the old 4S backup
    -tinyumbrella
    and a few other things I've forgotten now.
    The most important things I needed were the contact list and SMS history.  I've got the contact list by extracting it successfully from the iTunes backup folder and using iCloud, but I cannot figure out how to import the SMS.CSV file into my iPhone 5S.
    Any suggestions on how I can get my SMS messages from my iPhone 4S onto my 5S?
    If anyone has any other suggestions on getting my Backup Restore to work that would be awesome too!
    Thanks!

    A word on SMS text messages: it is practically impossible to import an SMS text-message file to a new iPhone and have the Messages app on the new device read the messages.
    You can, however, transfer contacts and other individual data (such as calendars and notes) from one iPhone to the next. I had an old Nokia whose contacts I exported to a CSV file. I then imported the file to my iPhone, which converted it into multiple contacts in the Contacts app on the iPhone. You can use a contact management app such as this one in order to import the contact file to the new iPhone.

  • Question about reading csv file into internal table

    Someone (thanks to those nice guys!) in this forum suggested that I use FM KCD_CSV_FILE_TO_INTERN_CONVERT to read a csv file into an internal table. However, it can only be used to read a local file.
    I would like to ask how I can read a CSV file on the application server into an internal table.
    I can't simply use SPLIT, as there may be commas in the content, e.g.
    "abc","aaa,ab",10,"bbc"
    My expected output:
    abc
    aaa,ab
    10
    bbc
    Thanks again for your help.

    Hi Gundam,
    Try this code. I have made a custom parser to read the fields in a record and split them accordingly. I have also tested it with your provided test cases and it works fine.
    OPEN DATASET dsn FOR INPUT IN TEXT MODE ENCODING DEFAULT.
    DO.
      READ DATASET dsn INTO record.
      IF sy-subrc <> 0.
        EXIT.
      ENDIF.
      PERFORM parser USING record.
    ENDDO.
    CLOSE DATASET dsn.
    *DATA str(32) VALUE '"abc",10,"aaa,ab","bbc"'.
    *DATA str(32) VALUE '"abc","aaa,ab",10,"bbc"'.
    *DATA str(32) VALUE '"a,bc","aaaab",10,"bbc"'.
    *DATA str(32) VALUE '"abc","aaa,ab",10,"b,bc"'.
    *DATA str(32) VALUE '"abc","aaaab",10,"bbc"'.
    FORM parser USING str.
    DATA field(12).
    DATA field1(12).
    DATA field2(12).
    DATA field3(12).
    DATA field4(12).
    DATA cnt TYPE i.
    DATA len TYPE i.
    DATA temp TYPE i.
    DATA start TYPE i.
    DATA quote TYPE i.
    DATA rec_cnt TYPE i.
    len = strlen( str ).
    cnt = 0.
    temp = 0.
    rec_cnt = 0.
    DO.
    *  Start at the beginning
      IF start EQ 0.
        "string just ENDED start new one.
        start = 1.
        quote = 0.
        CLEAR field.
      ENDIF.
      IF str+cnt(1) EQ '"'.  "Check for qoutes
        "CHECK IF quotes is already set
        IF quote = 1.
          "Already quotes set
          "Start new field
          start = 0.
          quote = 0.
          CONCATENATE field '"' INTO field.
          IF field IS NOT INITIAL.
            rec_cnt = rec_cnt + 1.
            CONDENSE field.
            IF rec_cnt EQ 1.
              field1 = field.
            ELSEIF rec_cnt EQ 2.
              field2 = field.
            ELSEIF rec_cnt EQ 3.
              field3 = field.
            ELSEIF rec_cnt EQ 4.
              field4 = field.
            ENDIF.
          ENDIF.
    *      WRITE field.
        ELSE.
          "This is the start of quotes
          quote = 1.
        ENDIF.
      ENDIF.
      IF str+cnt(1) EQ ','. "Check end of field
        IF quote EQ 0. "This is not inside quote end of field
          start = 0.
          quote = 0.
          CONDENSE field.
    *      WRITE field.
          IF field IS NOT INITIAL.
            rec_cnt = rec_cnt + 1.
            IF rec_cnt EQ 1.
              field1 = field.
            ELSEIF rec_cnt EQ 2.
              field2 = field.
            ELSEIF rec_cnt EQ 3.
              field3 = field.
            ELSEIF rec_cnt EQ 4.
              field4 = field.
            ENDIF.
          ENDIF.
        ENDIF.
      ENDIF.
      CONCATENATE field str+cnt(1) INTO field.
      cnt = cnt + 1.
      IF cnt GE len.
        EXIT.
      ENDIF.
    ENDDO.
    WRITE: field1, field2, field3, field4.
    ENDFORM.
    Regards,
    Wenceslaus.

  • How to generate a second csv file with different report columns selected?

    Hi. Everybody:
    How can I generate a second csv file with different report columns selected?
    The first csv file is easy (Report Attributes -> Report Export -> enable CSV output: Yes). However, our users demand two csv files with different report columns selected, to meet their different needs.
    (The users don't want one csv file with all report columns included. They just want to get exactly what they need directly, with no extra columns.)
    Thank you for any help!
    MZ

    Hello,
    I do this regularly. A typical example would be: only the columns "FIRST_NAME" and "LAST_NAME" are displayed in the report, whereas the csv exported with UTL_FILE contains the complete address (street, house number, additions, zip, town, state ...); these fields are needed e.g. for form letters.
    You do not need another page, just an additional button named e.g. "export_to_csv" on your report page.
    The csv export itself is handled by a PL/SQL stored procedure (I like to keep business logic outside of APEX) which is invoked by pressing the "export_to_csv" button. Of course the stored procedure can also take parameters.
    An example would be something like:
    PROCEDURE srn_brief_mitglieder (
         p_start_mg_nr IN NUMBER,
         p_ende_mg_nr IN NUMBER
    )
    AS
    export_file          UTL_FILE.FILE_TYPE;
    l_line               VARCHAR2(20000);
    l_lfd               NUMBER;
    l_dateiname          VARCHAR2(100);
    l_datum               VARCHAR2(20);
    l_hilfe               VARCHAR2(20);
    CURSOR c1 IS
    SELECT
    MG_NR
    ,TO_CHAR(MG_BEITRITT,'dd.mm.yyyy') AS MG_BEITRITT ,TO_CHAR(MG_AUFNAHME,'dd.mm.yyyy') AS MG_AUFNAHME
    ,MG_ANREDE ,MG_TITEL ,MG_NACHNAME ,MG_VORNAME
    ,MG_STRASSE ,MG_HNR ,MG_ZUSATZ ,MG_PLZ ,MG_ORT
    FROM MITGLIEDER
    WHERE MG_NR >= p_start_mg_nr
    AND MG_NR <= p_ende_mg_nr
    --WHERE ROWNUM < 10
    ORDER BY MG_NR;
    BEGIN
    SELECT TO_CHAR(SYSDATE, 'yyyy_mm_dd' ) INTO l_datum FROM DUAL;
    SELECT TO_CHAR(SYSDATE, 'hh24miss' ) INTO l_hilfe FROM DUAL;
    l_datum := l_datum||'_'||l_hilfe;
    --DBMS_OUTPUT.PUT_LINE ( l_datum);
    l_dateiname := 'SRNBRIEF_MITGLIEDER_'||l_datum||'.CSV';
    --DBMS_OUTPUT.PUT_LINE ( l_dateiname);
    export_file := UTL_FILE.FOPEN('EXPORTDIR', l_dateiname, 'W');
    l_line := '';
    --HEADER
    l_line := '"NR"|"BEITRITT"|"AUFNAHME"|"ANREDE"|"TITEL"|"NACHNAME"|"VORNAME"';
    l_line := l_line||'|"STRASSE"|"HNR"|"ZUSATZ"|"PLZ"|"ORT"';
         UTL_FILE.PUT_LINE(export_file, l_line);
    FOR rec IN c1
    LOOP
         l_line :=  '"'||rec.MG_NR||'"';     
         l_line := l_line||'|"'||rec.MG_BEITRITT||'"|"' ||rec.MG_AUFNAHME||'"';
         l_line := l_line||'|"'||rec.MG_ANREDE||'"|"'||rec.MG_TITEL||'"|"'||rec.MG_NACHNAME||'"|"'||rec.MG_VORNAME||'"';     
         l_line := l_line||'|"'||rec.MG_STRASSE||'"|"'||rec.MG_HNR||'"|"'||rec.MG_ZUSATZ||'"|"'||rec.MG_PLZ||'"|"'||rec.MG_ORT||'"';          
    --     DBMS_OUTPUT.PUT_LINE (l_line);
    -- in datei schreiben
         UTL_FILE.PUT_LINE(export_file, l_line);
    END LOOP;
    UTL_FILE.FCLOSE(export_file);
    END srn_brief_mitglieder;
    Edited by: wucis on Nov 6, 2011 9:09 AM
