BW - Using up memory when loading full files

Each night we have a number of jobs which load our data into BW. Most of the loads are full data loads, so the system should wipe out the data that was there previously. However, it seems to be keeping the information in the database; not duplicating the information, but using up the memory. I think we may have had a program which used to delete this. Does anyone have any idea what this would be?
We are on the old version 2.1C.

Wendy,
It is more of a maintenance task. A couple of things you can do:
1. If you load data to targets through the PSA, you can delete old PSA data after the load, for example have data older than 'xxx' days deleted from the PSA. This will improve performance.
2. Loading a full load via a single InfoPackage to multiple targets at the same time creates memory bottlenecks. You can avoid this by using a staging area: load into a DSO first, and from there load into the further targets using individual InfoPackages. Though the same amount of data passes through memory, the system handles it better and performs better.
3. Where possible, try to use 'Delta' loads if your extractor supports them.
Hope this helps, award points if useful.
Alex(Arthur Samson)

Similar Messages

  • When opening files "Not enough memory to load TIFF file."

    The Mac just had a new build of OS X 10.6.8.
    Total memory is 4 GB, of which over 3 GB is free.
    Opening any image gives a memory error: "Not enough memory to load TIFF file".

    Open the linked TIFF file(s) in Photoshop and resave with LZW compression off (I believe). I know you can place an LZW TIFF and it usually works fine, but I believe I had this problem in the past and turned compression off / changed the RLE settings, and Illustrator was made happy. Also check whether the image is 8 bits/channel (Photoshop >> Image >> 8 Bits/Channel).

  • Photoshop crashes when loading a file or image

    Using Photoshop cs3 on a PC with Windows 7 installed.
    Major problem: Photoshop gives no response when opening an image or PSD file of any sort, and ends with the options to either close the program or close and search for a solution online, which gives no solution whatsoever.
    PS CS3 was working just fine when installed, but after I installed an NVIDIA plug-in it started having problems. When loading PNG files, a window would appear telling me that Photoshop could not parse the file. After seeing this I went to uninstall the plug-in. The problem seemed to go away, but shortly afterwards Photoshop started crashing at random, and I do mean random. At first, starting Photoshop seems to rule out a crash, but after either a short or a long period of time it will crash when trying to open a file. Opening multiple files at once doesn't work either; it cooperates with only one file open, until I open a second. I'm not sure the plug-in is what caused the problem, but I do know everything since then seems hectic. After every crash I open CS3 again with all of its settings reset from when I last exited normally.
    Plugin Used:
    Photoshop_Plugins_8.23.1101.1715
    Application type; 4,317 KB
    Company - InstallShield Software Corporation
    Have you tried uninstalling and reinstalling the program?
    Unfortunately, yes, I have attempted this and the problem still occurs in CS3. This makes me fear that the entire program is glitched.
    Do you have any other plug-ins installed?
    I've checked and there are no other plug-ins attached.
    This is my last option before resorting to reformatting my computer. If there is any input on the topic, now is the time.

    Just a hunch: change your default printer to something else (a locally attached printer, not networked) or "PrintToPDF", then relaunch Photoshop and try again.
    It could also be a plugin (try disabling ALL third party plugins).
    And it could be a bad OS install, a virus, etc.  It may take a while to find.

  • Please tell me which sound file format uses minimum memory in a Director file

    Please tell me which sound file format uses the minimum memory in a Director file, because on adding sound the projector file becomes very large.
    So I am confused about which file format should be used.

    saramultimedia,
    Are you certain that you're asking your question in the right place? This forum is about the Adobe Media Encoder application. Most of the people who answer questions here are familiar with the digital video and audio applications such as After Effects and Premiere Pro.
    If you have a question about Director, it's probably best to ask on the Director forum. But I see that you already know that, since you already have a thread on that forum to ask this question, and it got an answer:
    http://forums.adobe.com/thread/769569

  • My 64Bit Adobe CS6 hangs when loading a file

    My Windows 7 Adobe CS6 v13.0.1 x64 hangs when loading a file. However, the CS6 v13.0.1 x32 application runs correctly. Do I need to re-install the Adobe CS6 Upgrade?

    Hi there! Because the forum you originally posted in is for beginners trying to learn the basics of Photoshop, I moved your question to the Photoshop General Discussion forum, where you'll get more specialized help.
    To help others help you, please read through this article and provide any additional relevant details.

  • "Memory effect" when loading .xls file information using PropertyLoader

    I have a TestStand 3.1 application in which the sequence starts off by loading a number of configuration settings embedded in different .xls files using the PropertyLoader.
    Unfortunately, TestStand sometimes loads previously used .xls files (same name, located elsewhere) rather than the ones it was supposed to. In particular, if a .xls file is missing, TestStand will often (always?) load a previously used file with the same name but located elsewhere. VERY inconvenient when testing...!
    Is there any way to remove this unfortunate "memory effect"?

    Where are your sequence files located? If the .xls files are located relative to them, you might want to use a more fully specified relative path to the files (for example: bin\config\filename.xls) rather than just filename.xls. Then be careful to remove search directories (especially recursive ones) that you might have added to find these files. It's very easy to get into problems with recursive search directories, or by adding too many search directories, if you have lots of files with the same names; by instead using paths relative to the sequence file you can avoid the need to add search directories in many cases.
    Hope this helps,
    -Doug

  • Problem when loading xml file using sql loader

    I am trying to load data into the table test_xml (xmldata XMLType).
    I have an XML file and I want the whole file to be loaded into a single column.
    When I use the following control file and execute it from the command prompt as follows:
    sqlldr $1@$TWO_TASK control=$XXTOP/bin/LOAD_XML.ctl direct=true
    LOAD DATA
    INFILE *
    TRUNCATE INTO TABLE test_xml
    xmltype(xmldata)
    FIELDS
    ext_fname filler char(100),
    xmldata LOBFILE (ext_fname) TERMINATED BY EOF
    BEGIN DATA
    /u01/APPL/apps/apps_st/appl/xxtop/12.0.0/bin/file.xml
    the file is loaded into the table perfectly.
    Unfortunately, I can't hardcode the file name, as the file name will change dynamically.
    So I removed the block
    BEGIN DATA
    /u01/APPL/apps/apps_st/appl/xxtop/12.0.0/bin/file.xml
    from the control file and tried to execute it by giving the file path on the command line as follows:
    sqlldr $1@$TWO_TASK control=$XXTOP/bin/LOAD_XML.ctl data=/u01/APPL/apps/apps_st/appl/xxtop/12.0.0/bin/file.xml direct=true
    but strangely it's trying to load each line of the XML file into the table instead of the whole file.
    Please find the log of the program with error
    Loading of XML through SQL*Loader Starts
    SQL*Loader-502: unable to open data file '<?xml version="1.0"?>' for field XMLDATA table TEST_XML
    SQL*Loader-553: file not found
    SQL*Loader-509: System error: No such file or directory
    SQL*Loader-502: unable to open data file '<Root>' for field XMLDATA table TEST_XML
    SQL*Loader-553: file not found
    SQL*Loader-509: System error: No such file or directory
    SQL*Loader-502: unable to open data file '<ScriptFileType>' for field XMLDATA table TEST_XML
    SQL*Loader-553: file not found
    SQL*Loader-509: System error: No such file or directory
    SQL*Loader-502: unable to open data file '<Type>Forms</Type>' for field XMLDATA table TEST_XML
    SQL*Loader-553: file not found
    SQL*Loader-509: System error: No such file or directory
    SQL*Loader-502: unable to open data file '</ScriptFileType>' for field XMLDATA table TEST_XML
    SQL*Loader-553: file not found
    SQL*Loader-509: System error: No such file or directory
    SQL*Loader-502: unable to open data file '<ScriptFileType>' for field XMLDATA table TEST_XML
    SQL*Loader-553: file not found
    SQL*Loader-509: System error: No such file or directory
    SQL*Loader-502: unable to open data file '<Type>PLL</Type>' for field XMLDATA table TEST_XML
    SQL*Loader-553: file not found
    SQL*Loader-509: System error: No such file or directory
    SQL*Loader-502: unable to open data file '</ScriptFileType>' for field XMLDATA table TEST_XML
    SQL*Loader-553: file not found
    SQL*Loader-509: System error: No such file or directory
    SQL*Loader-502: unable to open data file '<ScriptFileType>' for field XMLDATA table TEST_XML
    Please help me: how can I load the full XML into a single column using the command line, without hardcoding the file name in the control file?
    Edited by: 907010 on Jan 10, 2012 2:24 AM

    but strangely it's trying to load each line of xml file into table instead of whole file
    Nothing strange: the data parameter specifies the file containing the data to load.
    If you use the XML filename here, the control file will try to interpret each line of the XML as a separate file path.
    The traditional approach to this is to have the filename stored in another file, say filelist.txt, and use, for example:
    echo "/u01/APPL/apps/apps_st/appl/xxtop/12.0.0/bin/file.xml" > filelist.txt
    sqlldr $1@$TWO_TASK control=$XXTOP/bin/LOAD_XML.ctl data=filelist.txt direct=true;
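    The filelist.txt approach above can be scripted so the XML path is generated at run time and nothing is hard-coded in the control file. A minimal sketch (the sqlldr invocation itself is commented out, since the connect string and control-file location are site-specific; the path is the one from the thread):

```shell
# The XML path would normally arrive from a variable or script argument;
# here we use the path quoted in the thread. Write it as the single line
# of filelist.txt.
XML_PATH="/u01/APPL/apps/apps_st/appl/xxtop/12.0.0/bin/file.xml"
echo "$XML_PATH" > filelist.txt

# SQL*Loader then reads filelist.txt, and its one line becomes the
# LOBFILE path for the xmldata column:
# sqlldr $1@$TWO_TASK control=$XXTOP/bin/LOAD_XML.ctl data=filelist.txt direct=true

cat filelist.txt
```

    This keeps the control file static while still allowing a different XML file on every run.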

  • When using Vikas' program to load large files, getting an error

    Hello,
    I am using Vikas' program to load large data files: http://htmldb.oracle.com/pls/otn/f?p=38131:1
    This works fine, except when I click on the button to create table, then I get a "not found" error--
    failed to parse SQL query:
    ORA-00942: table or view does not exist
    What might cause this? I've checked grants and such and reviewed the code, but haven't figured it out...
    Thanks!


  • Why does my Mac use virtual memory when I still have free physical memory?

    I have a 2011 i7 quad-core Mac, and I was hoping it would scream. Most of the time it does. However, when trying to edit within FCPX I get a very disappointing experience, with many pauses and pinwheels, unless I close every single other program.
    I have 8 GB of physical memory, and when I'm experiencing these problems I see that I still have 1-2 GB of physical memory free or inactive. At the same time, FCPX is only using 2 GB of memory. I just happened to keep an eye on the VM page ins/outs and noticed them going up.
    Right now I'm doing some browsing and emailing, that's about it. The machine sits with over 4 GB of memory free or inactive, and yet the page ins/outs still go up occasionally. It's currently at over 2 million page ins and over 1 million page outs.
    So with so much physical memory free, why is this happening?! At the moment the Mac feels nice and responsive, but if I start trying to use FCPX I'll start to experience these slowdowns and stalls. Whenever I see these, I see my main HDD being accessed while the pinwheel is displayed. I mean, I get it: it's VM, the HDD is too full, a bit fragmented perhaps, it's stalling... but I've got gigabytes of memory sitting free or inactive... why won't the OS use it?!
    Would my experience improve if I took the plunge and got 16 GB of memory instead of 8 GB?
    Thanks for your help!

    Because without virtual memory, managing computer RAM is a royal pain in the ...
    Virtual memory costs you nothing, and gains you huge benefits, even if you do not notice it.
    What costs you is when you need more real RAM than is available and things are thrown out of RAM, either back to the original file they came from (read-only information) or pushed out to the swapfiles (/var/vm/*). Then the system has to wait for slower disk access. But even this is better than not being able to run the apps until you quit something else.
    (Speaking as someone who started his professional life working with 1" punched paper tape, 80-column cards, 7-track and 9-track mag tapes, 1 MB disks (you heard me right, 1 megabyte), etc., trust me when I tell you that virtual memory is a godsend to software development.)
    There are a lot of problems with running a modern operating system without virtual memory. For example, all the shared libraries and frameworks that provide services to an application would need to be compiled into the application, which means every application gets bigger, and instead of having a single copy of a shared library or framework, you would have dozens of copies wasting your RAM.
    Without virtual memory, you would be required to find a contiguous chunk of RAM to run your application. Think of this like going out to dinner: by yourself, you can take any available table, but if you go to dinner with your extended family, you need a table for 10 to 15, and if you are going to dinner with your high school graduation class, you will need hundreds of seats all next to each other at a very large table. In the latter situations you have to wait until the restaurant has enough contiguous space, which means you have to wait until other diners finish. There may be lots of empty tables, but they are not together, and your group wants/needs to sit together. Virtual memory allows gathering any 4K chunks of RAM, building a virtual memory map for all those scattered 4K chunks, and making them look like one big contiguous chunk of RAM, so you can run your application right away, no waiting.
    Going back to shared libraries and frameworks: this code needs addresses resolved so that branches go to the correct locations during execution, and it needs addresses resolved for where its program variables are located in RAM. Using virtual memory, you can load a shared object into RAM once, then place it in everyone's virtual memory map at the exact same address. This means everyone can use the exact same code, and since everyone is using it at the same address, it makes life much easier for the operating system (translation: less work, less wasted CPU time, faster execution).
    When a program wants to grow, for example a web browser loading a web page (and its images) into RAM, it needs to allocate additional RAM. In the contiguous-RAM model, you would need to get control of the RAM immediately following your program, but if that RAM is being used by someone else, you have to wait until that program goes away.
    Virtual memory provides protection from another program looking at and modifying your program's RAM.  Malware would just love for virtual memory to go away.
    You want virtual memory.  What you do not want is excessive paging activity.
    If you are concerned, you can launch Applications -> Utilities -> Terminal. Once you have a terminal command prompt, enter the following command:
    sar -g 60 100
    which will tell you the number of 4K pages written to /var/vm/pagefile every minute for 100 minutes (modify the numbers to suit your tastes). You can then go about your normal usage and come back later to see how much you have been using the pagefiles. If you have mostly zeros and an occasional small burst, this is noise and not worth worrying about. If you have sustained pageout activity with higher numbers, then you should either consider running fewer things at the same time, or look for an application that is being greedy with its memory use (or has a memory leak), OR get more RAM for your Mac if you need to do all those things at once.
    But do not complain about virtual memory.  Life would be much worse without it.  Then again if you have a better idea, write a research paper, and get operating system vendors (as well as hardware vendors) to implement your ideas.  I am serious, as I've seen many accepted computing ideas be overturned by good new ideas.

  • Server goes out of memory when annotating TIFF File. Help with Tiled Images

    I am new to JAI and have a problem with the system going out of memory
    Objective:
    1)Load up a TIFF file (each approx 5- 8 MB when compressed with CCITT.6 compression)
    2)Annotate image (consider it as a simple drawString with the Graphics2D object of the RenderedImage)
    3)Send it to the servlet outputStream
    Problem:
    Server goes out of memory when 5 threads try to access it concurrently
    Runtime conditions:
    VM param set to -Xmx1024m
    Observation
    Writing the files takes a lot of time when compared to reading the files
    Some more information
    1) I need to do the annotating at pre-defined specific positions on the images (e.g. in the first quadrant, or maybe in the second quadrant).
    2) I know that using the TiledImage class it is possible to load up a portion of the image and process it.
    Things I need help with:
    I do not know how to send the whole file back to the servlet output stream after annotating a tile of the image.
    If I write the tiled image back to a file, or to the output stream, it gives me only the portion of the tile I read in and watermarked, not the whole image file.
    I have attached the code I use when I load up the whole image
    Could somebody please help with the TiledImage solution?
    Thx
    public void annotateFile(File file, String wText, OutputStream out, AnnotationParameter param) throws Throwable {
        ImageReader imgReader = null;
        ImageWriter imgWriter = null;
        TiledImage in_image = null, out_image = null;
        IIOMetadata metadata = null;
        ImageOutputStream ios = null;
        try {
            Iterator readIter = ImageIO.getImageReadersBySuffix("tif");
            imgReader = (ImageReader) readIter.next();
            imgReader.setInput(ImageIO.createImageInputStream(file));
            metadata = imgReader.getImageMetadata(0);
            in_image = new TiledImage(JAI.create("fileload", file.getPath()), true);
            System.out.println("Image Read!");
            Annotater annotater = new Annotater(in_image);
            out_image = annotater.annotate(wText, param);
            Iterator writeIter = ImageIO.getImageWritersBySuffix("tif");
            if (writeIter.hasNext()) {
                imgWriter = (ImageWriter) writeIter.next();
                ios = ImageIO.createImageOutputStream(out);
                imgWriter.setOutput(ios);
                ImageWriteParam iwparam = imgWriter.getDefaultWriteParam();
                if (iwparam instanceof TIFFImageWriteParam) {
                    // Copy the compression setting from the source TIFF directory
                    iwparam.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
                    TIFFDirectory dir = (TIFFDirectory) out_image.getProperty("tiff_directory");
                    double compressionParam = dir.getFieldAsDouble(BaselineTIFFTagSet.TAG_COMPRESSION);
                    setTIFFCompression(iwparam, (int) compressionParam);
                } else {
                    iwparam.setCompressionMode(ImageWriteParam.MODE_COPY_FROM_METADATA);
                }
                System.out.println("Trying to write Image ....");
                imgWriter.write(null, new IIOImage(out_image, null, metadata), iwparam);
                System.out.println("Image written....");
            }
        } finally {
            if (imgWriter != null)
                imgWriter.dispose();
            if (imgReader != null)
                imgReader.dispose();
            if (ios != null) {
                ios.flush();
                ios.close();
            }
        }
    }

    user8684061 wrote:
    You are right, the SGA is too large for my server.
    I guess Oracle set the SGA automatically when I chose the default installation, but why would the SGA be so big? Is Oracle not smart enough?
    The default database configuration reserves 40% of physical memory for the SGA of an instance, which you as a user can always change. I don't see anything wrong with that, or any reason to say Oracle is not smart.
    If I don't decrease the SGA, but increase max-shm-memory, would it work?
    That needs support from the CPU architecture (32-bit or 64-bit) and the kernel as well. Read more about huge pages.

  • 4.2.3/.4 Data load wizard - slow when loading large files

    Hi,
    I am using the data load wizard to load CSV files into an existing table. It works fine with small files of up to a few thousand rows. When loading 20k rows or more, the loading process becomes very slow. The table has a single numeric column as primary key.
    The primary key is declared at "Shared Components" -> Logic -> "Data Load Tables" and is recognized as "pk(number)" with "Case Sensitive" set to "No".
    While loading data, this configuration leads to the execution of the following query for each row:
    select 1 from "KLAUS"."PD_IF_CSV_ROW" where upper("PK") = upper(:uk_1)
    which can be found in the v$sql view while loading.
    This makes the loading process slow because, due to the upper function, no index can be used.
    It seems that the "Case Sensitive" setting is not evaluated.
    Dropping the numeric index for the primary key and using a function-based index does not help.
    The explain plan shows an implicit to_char conversion:
    UPPER(TO_CHAR(PK)) = UPPER(:UK_1)
    This is missing in the query, but maybe it is necessary for the function-based index to work.
    Please provide a solution or workaround for the data load wizard to work with large files in an acceptable amount of time.
    Best regards
    Klaus

    Nevertheless, a bulk loading process is what I would really like to have as part of the wizard.
    If all of the CSV files are identical:
    use the Excel2Collection plugin (Process Type Plugin - EXCEL2COLLECTIONS),
    create a VIEW on the collection (makes it easier elsewhere),
    create a procedure (in a package) to bulk process it.
    The most important thing is to have, somewhere in the package (i.e. your code that is not part of APEX), information that clearly states which columns in the collection map to which columns in the table, the view, and the variables (APEX_APPLICATION.g_fxx()) used for tabular forms.
    MK
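    A side note on the UPPER() predicate from the original question: if a function-based index is attempted at all, it has to match the predicate exactly as it appears in the execution plan, including the implicit TO_CHAR conversion. An untested sketch (the index name is invented; the table and column names come from the post):

```sql
-- Matches the rewritten predicate UPPER(TO_CHAR("PK")) = UPPER(:UK_1)
CREATE INDEX pd_if_csv_row_fbi ON "KLAUS"."PD_IF_CSV_ROW" (UPPER(TO_CHAR("PK")));
```

    An index on UPPER("PK") alone would not be used, because the optimizer applies the implicit TO_CHAR to the numeric column first.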

  • Error 1 when loading flat file in BW 7.0

    Hi,
    The flat file structure is the same as the transfer structure. It's a CSV file, and I also checked the delimiters and such. The file is not open; it is closed while I am loading it. The same file gets loaded if I try on another laptop with my ID. If I use my colleague's ID on my system, it doesn't work either, so the basic problem is with my laptop. I know it's not related to the type of data or the transfer structure. It's some setting on my laptop which got changed automatically. If I install other software like Mozilla Firefox or Yahoo Messenger, will that create a problem? I do not understand at all why it is like this. Please help. The error messages I get when I try to load the flat file:
    Error 1 when loading external data
    Diagnosis
    Error number 1 occurred when loading external data:
    1. Error when reading the file (access rights, file name, ...)
    2. File size or number of records does not correspond to the data in the control file
    3. Error when generating the IDoc
    4. File contains invalid data (errors with an arithmetic operation or data conversion)
    Procedure
    Check whether you have the required access rights and whether the data in the control file is correct (file names, record length, number of records, ...). Correct the data in the control file if necessary and check the data file for invalid data (values of the wrong type, values in the wrong format for conversion exit,...). Check whether the file has headers that have not been specified.
    Error when opening the data file C:\vikki1.csv (origin C)
    Message no.
    Diagnosis
    File C:\vikki1.csv (origin C) could not be opened.
    Origin:
    A : Application server
    C : Client workstation
    Procedure
    Check whether the file entered exists and is not being used by other applications.

    Hi Vikki,
    Error 1 means your flat file is open while uploading the data.
    Your flat file should be closed while uploading data into BW;
    that is why it is saying "Error when opening the file...".
    First close that file and then upload; it will work.
    The rest of the things are OK.
    I hope this will help you.
    Regards,
    Khyati.

  • Error when loading security file .sec

    Hi,
    I am getting an error when trying to load a security file.
    Below is an extract of the security file:
    !FILE_FORMAT=2.0
    !VERSION=11.12
    !USERS_AND_GROUPS
    Praveen@Native Directory
    admin@Native Directory
    !SECURITY_CLASSES
    [Default]
    123
    !ROLE_ACCESS
    Provisioning Manager;Praveen@Native Directory
    Journals Administrator;Praveen@Native Directory
    Advanced User;Praveen@Native Directory
    Data Form Write Back from Excel;Praveen@Native Directory
    Inter-Company Transaction Admin;Praveen@Native Directory
    Provisioning Manager;admin@Native Directory
    Application Administrator;admin@Native Directory
    !SECURITY_CLASS_ACCESS
    123;Praveen@Native Directory;All;N
    I want to add a user and assign provisioning directly in the above security file (.sec), without doing it in Shared Services.
    For the above security file I have added balaji and loaded it to HFM, but it's not working:
    !FILE_FORMAT=2.0
    !VERSION=11.12
    !USERS_AND_GROUPS
    Praveen@Native Directory
    balaji@Native Directory
    admin@Native Directory
    !SECURITY_CLASSES
    [Default]
    123
    !ROLE_ACCESS
    Provisioning Manager;Praveen@Native Directory
    Journals Administrator;Praveen@Native Directory
    Advanced User;Praveen@Native Directory
    Data Form Write Back from Excel;Praveen@Native Directory
    Inter-Company Transaction Admin;Praveen@Native Directory
    Provisioning Manager;balaji@Native Directory
    Journals Administrator;balaji@Native Directory
    Advanced User;balaji@Native Directory
    Data Form Write Back from Excel;balaji@Native Directory
    Inter-Company Transaction Admin;balaji@Native Directory
    Provisioning Manager;admin@Native Directory
    Application Administrator;admin@Native Directory
    !SECURITY_CLASS_ACCESS
    123;Praveen@Native Directory;All;N
    123;balaji@Native Directory;All;N
    Can we add users and assign provisioning in the security file without creating the user in Shared Services?
    Thanks

    If you have your application built using MSAD, then you can load the file the way you built it above. However, because you're using the Native Directory, you need to create the user in the directory (HSS) first. Once the user is created, you can then go through and use the .sec method to assign access/roles, but the actual user creation can't be done through a load file.
    One of the easiest ways to understand why this can't be done: when you load the .sec file, you're not setting up a password for the user.

  • Using OR with WHEN in .ctl file

    Hello,
    Can I use OR with the WHEN clause in a .ctl file?
    I am trying to load a txt file into an Oracle table. I used positions.
    INTO TABLE ABC
    WHEN ((814:834)!=BLANKS) OR ((934:938)!=BLANKS)
    DE_CASE POSITION(1:7) NULLIF DE_CASE=BLANKS,
    DE_WHERECODE POSITION(814:814) NULLIF DE_WHERECODE=BLANKS,
    DE_WHERE POSITION(815:831) NULLIF DE_WHERE=BLANKS,
    DE_PDCODE POSITION(832:832) NULLIF DE_PDCODE=BLANKS
    If not what should I do for OR?
    Thank you,
    H.

    My mistake, it should be WHEN, not WHERE.
    Review the documentation:
    http://download-west.oracle.com/docs/cd/B10501_01/server.920/a96652/ch05.htm
    When I asked "Did you try the import process already?", I meant: did you try your SQL*Loader process with your control file? Are you getting an error while executing the loading process? If you are, please post the error; you will probably also need to provide more info, like the entire control file with sample data.
    Edited by: rkhtsen on Mar 26, 2009 1:11 PM
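    For reference, SQL*Loader's WHEN clause accepts conditions joined by AND but not OR. One common workaround (a sketch only, untested, reusing the positions and fields from the question) is to repeat the INTO TABLE clause, one per condition. Note that a record satisfying both conditions would then be loaded twice, so the conditions should be mutually exclusive or the duplicates dealt with afterwards:

```
INTO TABLE ABC
WHEN (814:834) != BLANKS
(DE_CASE POSITION(1:7) NULLIF DE_CASE=BLANKS,
 DE_WHERECODE POSITION(814:814) NULLIF DE_WHERECODE=BLANKS,
 DE_WHERE POSITION(815:831) NULLIF DE_WHERE=BLANKS,
 DE_PDCODE POSITION(832:832) NULLIF DE_PDCODE=BLANKS)
INTO TABLE ABC
WHEN (934:938) != BLANKS
(DE_CASE POSITION(1:7) NULLIF DE_CASE=BLANKS,
 DE_WHERECODE POSITION(814:814) NULLIF DE_WHERECODE=BLANKS,
 DE_WHERE POSITION(815:831) NULLIF DE_WHERE=BLANKS,
 DE_PDCODE POSITION(832:832) NULLIF DE_PDCODE=BLANKS)
```

    Because every field uses an absolute POSITION specification, the second INTO TABLE clause does not need a POSITION(1) reset.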

  • HFM APP error when loading security file

    Good Day,
    I am getting this error when loading a security file in the HFM APP.
    User not found with identity=native://DN=cn=0a3b0f123e6a9b46:1f9eca5f:10f110fd853:-7d24,ou=People,dc=css,dc=hyperion,dc=com?USER
    File: Version: 9.2.0.0.1077 Line: -1 Error: (-2147216700)(0x800412C4)(User not found with identity=native://DN=cn=0a3b0f123e6a9b46:1f9eca5f:10f110fd853:-7d24,ou=People,dc=css,dc=hyperion,dc=com?USER) File: CHsxDSSecurity.cpp Version: 9.2.0.0.1077 Line: 6859 Error: (-2147216700)(0x800412C4)() File: CHsxSecurity.cpp Version: 9.2.0.0.1077 Line: 2307 Error: (-2147216700)(0x800412C4)() File: CHsvSecurityAccess.cpp Version: 9.2.0.0.1077 Line: 3980 Error: (-2147216700)(0x800412C4)() File: CHsvSecurityLoadACM.cpp Version: 9.2.0.0.1077 Line: 2864 Error: (-2147216700)(0x800412C4)() File: CHsvSecurityLoadACM.cpp Version: 9.2.0.0.1077 Line: 535 Error: (-2147216700)(0x800412C4)()
    Has anyone seen this before?
    Thanks
    Azmat Bhatti

    If you have your application built using MSAD, then you can load the file the way you built it above. However, because you're using the Native Directory, you need to create the user in the directory (HSS) first. Once the user is created, you can then go through and use the .sec method to assign access/roles, but the actual user creation can't be done through a load file.
    One of the easiest ways to understand why this can't be done: when you load the .sec file, you're not setting up a password for the user.
