Sun IDM 8.1.1P2: Export user records to xls file Functionality Issue
Hi All,
This is my first post in this forum, so please guide me onto the right path.
We implemented custom functionality in the IDM User console to search user records from AD and LDAP. After the search, we provide an export function that writes the resulting user records to an XLS file.
The issue is that the number of user records exported to the XLS file is not the same as the number of records returned by the search.
This functionality works well in our Development (DEV) and VAL environments, but not in the Production environment.
I compared the custom JSP file and the calling Rule across all three environments and they are identical.
In the VAL and PROD server.log I see the following:
PWC1406: Servlet.service() for servlet jsp threw exception
java.lang.IllegalStateException: PWC3991: getOutputStream() has already been called for this response
But this error message didn't stop VAL from exporting the same number of records to the XLS file.
# of records searched from PROD is 16809
# of records searched from VAL is 10312
These counts are constant every time.
# of records exported to XLS from PROD is 168 or 1274 (it varies; each export shows a different number)
# of records exported to XLS from VAL is 10312 (always the same as the search)
We are on Glassfish V2.1.1P8.
I checked the file sizes on VAL and PROD; both are the same.
It would be great if anyone could point me in the right direction on where else to check for a possible cause.
Thanks,
Ravi Mangalagiri
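The PWC3991 IllegalStateException above typically means the JSP has already obtained the implicit JspWriter (any template text outside scriptlets does this) before the code calls response.getOutputStream(); the usual remedy is to keep the binary export in code that touches only one of the two. A minimal, hypothetical sketch of that shape; the method name and the row format are illustrative, not the original implementation:

```java
import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;
import java.util.List;

public class XlsExportSketch {

    // Renders every record into the buffer and returns the count written,
    // so the caller can compare it against the search count.
    public static int writeUsersAsXls(ByteArrayOutputStream out, List<String> users) {
        int written = 0;
        for (String user : users) {
            out.writeBytes((user + "\r\n").getBytes(StandardCharsets.UTF_8));
            written++;
        }
        return written;
    }

    public static void main(String[] args) {
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        int count = writeUsersAsXls(buffer, List.of("user1", "user2", "user3"));
        System.out.println(count + " records, " + buffer.size() + " bytes");
    }
}
```

In a servlet (or a JSP stripped of all template text), the rendered bytes would be sent via response.getOutputStream() after setting the content type and Content-Length; since getWriter() is never called, the IllegalStateException cannot occur, and Content-Length lets clients detect a truncated download.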
Hi Arjun,
Thanks for responding to my post.
The search works as expected in all three environments: DEV, VAL and PROD.
The search and alignment are performed by the Rule, whereas the DB connection and saving to XLS are performed by the custom JSP file.
Since the search works fine, I don't think there is any permissions issue with AD or LDAP.
A couple of things I noticed in server.log from all environments:
SEVERE|sun-appserver2.1.1|javax.enterprise.system.container.web|_ThreadID=297;_ThreadName=httpSSLWorkerThread-9084-102;_RequestID=5efa3ecb-0ec9-4695-ab51-8049257b9d57;|StandardWrapperValve[jsp]: PWC1406: Servlet.service() for servlet jsp threw exception
java.lang.IllegalStateException: PWC3991: getOutputStream() has already been called for this response
and
WARNING|sun-appserver2.1.1|javax.enterprise.system.stream.err|_ThreadID=78;_ThreadName=Provisioner;_RequestID=531d32b0-6d9a-43e-bd74-0bc9478ffdae;|org.xml.sax.SAXParseException: XML document structures must start and end within the same entity.
at org.apache.xerces.util.ErrorHandlerWrapper.createSAXParseException(Unknown Source)
These are logged while the custom JSP is executing:
getOutputStream() has already been called for this response.
I am not sure this is the root cause, since it is logged in DEV and VAL as well.
Other things I noticed:
Yesterday I ran 10 tests, and each took 6 min 18 sec, 6 min 19 sec, or 6 min 22 sec.
I also noticed that the number of user records exported to XLS depends on the transfer rate.
For example:
if the file download transfer rate is 1.50 KB/s, the exported records number between 1200 and 1800, whereas the search returns 16590 records;
if the file download transfer rate is 800 B/s, the exported records number between 200 and 600, whereas the search returns 16590 records.
I am not sure where to check this time value (6 min 18 sec) or which attribute controls it.
Please give me some pointers on where else I need to check.
Thanks,
Ravi.
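The numbers reported above are consistent with a fixed cutoff of roughly 6 min 18 s closing the connection mid-download: records exported ≈ transfer rate × cutoff ÷ bytes per record. A quick sanity check of that arithmetic; the 378 s cutoff comes from the tests above, while the 500 bytes-per-row figure is purely an assumption for illustration:

```java
public class TruncationArithmetic {

    // Records that fit through the pipe before a fixed cutoff closes it.
    public static long recordsBeforeCutoff(double bytesPerSecond, long cutoffSeconds,
                                           long bytesPerRecord) {
        return (long) (bytesPerSecond * cutoffSeconds) / bytesPerRecord;
    }

    public static void main(String[] args) {
        long cutoff = 6 * 60 + 18; // 378 s, the constant test duration reported above
        // Assuming ~500 bytes per exported row (a made-up figure):
        System.out.println(recordsBeforeCutoff(1536, cutoff, 500)); // 1161 (1.50 KB/s case)
        System.out.println(recordsBeforeCutoff(800, cutoff, 500));  // 604  (800 B/s case)
    }
}
```

Both results land in or near the observed 1200-1800 and 200-600 ranges, which supports a server- or proxy-side timeout killing the download rather than a data problem.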
Similar Messages
-
ICM Admin Script to export user variables to log file
Hi,
Is there an easy way to export/report the values of user variables to a file in ICM 7.x?
These are global variables set in admin scripts for open/close flags. I'm looking for a way to see at a glance what the values of all the variables are at a given time throughout the day, without having to monitor each script that uses them.
I was thinking of maybe an admin script that writes the values to a text file every X hours/minutes, but I am fairly new to ICM scripting and don't see a step to do this, so I'm not sure whether this is possible or whether there is another way.
I have a script where supervisors can force open/close different queues for various reasons (weather, etc.) and I'm trying to add a way to log when a change was made, for historical purposes and troubleshooting; just a text file or something easy.
Thanks, Erick

Only three years later, but in case someone else is ever looking, this should get you started. First, be sure your variables are set to persistent and replicated to an HDS. You can then extract this data from the Persistent_Variable data. It will require some joins to the User_Variable data to get a nice solid query.
-
How to specify a path to unload E$ table records into an .xls file using OdiSqlUnload
Hi
Can anyone help me specify a path in the OdiSqlUnload tool that works on both Windows and Linux?
I develop and test on Windows and move the generated scenario to a Linux box for the testing team.
If any error records are populated in the E$ table, how will those records be written to the XLS file? I later send that file as an email attachment to the people concerned.
Below is the code currently used with the OdiSqlUnload tool:
OdiSqlUnload "-FILE=d:\ODI_Error_Out_Files\Notification_Error_Records.xls" "-DRIVER=oracle.jdbc.OracleDriver" "-URL=jdbc:oracle:thin:@10.75.114.146:1521:POCWCDS" "-USER=wcds" "-PASS=h2yXeih4hFlXXV,QaMeRR2Fy" "-FILE_FORMAT=VARIABLE" "-ROW_SEP=\r\n" "-DATE_FORMAT=yyyy/MM/dd HH:mm:ss" "-CHARSET_ENCODING=ISO8859_1" "-XML_CHARSET_ENCODING=ISO-8859-1"
select * from E$_notification
Please help me make a single piece of code that works on both Windows and Linux.
Any suggestion will help me.
Thanks in advance
Regards,
Phanikanth

Hi Bhabani,
I have written the code below in the KM itself, with the technology set to Java BeanShell.
Code:
<@
String OS = System.getProperty("os.name").toLowerCase();
String v_path = "";
if (OS.indexOf("win") >= 0)
    v_path = "D:\\Unload_Dir\\<%=snpRef.getSession("SESS_NO")%>.xlsx";  // backslashes must be escaped in BeanShell strings
else if (OS.indexOf("mac") >= 0)
    v_path = "path details";
else if (OS.indexOf("nix") >= 0 || OS.indexOf("nux") >= 0 || OS.indexOf("aix") >= 0)
    v_path = "/odi_a/Middleware/logs/wcds/odi_logs/<%=snpRef.getSession("SESS_NO")%>.xlsx";
else if (OS.indexOf("sunos") >= 0)
    v_path = "solaris path";
@>
OdiSqlUnload "-FILE=<@=v_path@>" "-DRIVER=<%=odiRef.getInfo("DEST_JAVA_DRIVER")%>" "-URL=<%=odiRef.getInfo("DEST_JAVA_URL")%>" "-USER=<%=odiRef.getInfo("DEST_USER_NAME")%>" "-PASS=<%=odiRef.getInfo("DEST_ENCODED_PASS")%>" "-FILE_FORMAT=VARIABLE" "-ROW_SEP=\r\n" "-DATE_FORMAT=yyyy/MM/dd HH:mm:ss" "-CHARSET_ENCODING=ISO8859_1" "-XML_CHARSET_ENCODING=ISO-8859-1"
select * from <%=odiRef.getTable("L","ERR_NAME", "W")%>
It was executed well and below is the Execution code of the above code
Execution Code:
<@
String OS = System.getProperty("os.name").toLowerCase();
String v_path = "";
if (OS.indexOf("win") >= 0)
    v_path = "D:\\Unload_Dir\\1341360.xlsx";
else if (OS.indexOf("mac") >= 0)
    v_path = "path details";
else if (OS.indexOf("nix") >= 0 || OS.indexOf("nux") >= 0 || OS.indexOf("aix") >= 0)
    v_path = "/odi_a/Middleware/logs/wcds/odi_logs/1341360.xlsx";
else if (OS.indexOf("sunos") >= 0)
    v_path = "solaris path";
@>
OdiSqlUnload "-FILE=<@=v_path@>" "-DRIVER=oracle.jdbc.OracleDriver" "-URL=jdbc:oracle:thin:@10.75.114.146:1521:POCWCDS" "-USER=wcds" "-PASS=<@=snpRef.getInfo("DEST_ENCODED_PASS") @>" "-FILE_FORMAT=VARIABLE" "-ROW_SEP=\r\n" "-DATE_FORMAT=yyyy/MM/dd HH:mm:ss" "-CHARSET_ENCODING=ISO8859_1" "-XML_CHARSET_ENCODING=ISO-8859-1"
select * from WCDS.E$_CDS_COMPANY
Please confirm whether the above code is correct; if not, please correct it. Also, DEST_ENCODED_PASS is not encoding the password.
Regards
Phanikanth
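As an alternative to branching on os.name, the platform-specific separator can be left to java.io.File, so the same template works on Windows, Linux, and Solaris and only the base directory differs per environment. A hedged Java sketch of that idea; the directory and session values are placeholders, not the real ODI paths:

```java
import java.io.File;

public class UnloadPathSketch {

    // Builds <baseDir>/<sessionNo>.xlsx with the platform's own separator,
    // so no per-OS string literals (and no backslash-escaping bugs) are needed.
    public static String unloadPath(String baseDir, String sessionNo) {
        return new File(baseDir, sessionNo + ".xlsx").getPath();
    }

    public static void main(String[] args) {
        System.out.println(unloadPath("unload_dir", "1341360"));
    }
}
```

The base directory itself could come from an ODI variable or a system property, keeping a single code path for every operating system.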
Edited by: Phanikanth on Feb 18, 2013 1:09 AM
-
Export Chinese data to an XLS file using Toad
Hi,
I have one table that contains Chinese data.
All the data is correct.
I am exporting the Chinese data from the Oracle table and I am facing a problem.
I export the data through Toad into a flat file
and import it into an XLS file using the UTF character set.
After importing, all the characters get converted to ???? marks.
I cannot see the data properly in the XLS file.
My guess is that the database is not exporting the file as Unicode, and hence the characters get converted.
How do I export the data so that the Chinese data displays correctly?

Hi,
I am using Toad version 8.5. I tried with SQL Developer as well, but to no avail.
Can I set it like this?
ALTER SESSION SET NLS_LANGUAGE='SIMPLIFIED CHINESE';
ALTER SESSION SET NLS_TERRITORY='CHINA';
ALTER SESSION SET NLS_CHARACTER_SET='AMERICAN_AMERICA.UTF8';
My NLS_LANG in Windows for the 10g client is AMERICAN_AMERICA.WE8MSWIN1252.
Please suggest.
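The "????" symptom usually means the bytes were written (or read) in a charset that cannot represent Chinese, not that the database lost the data; an export that encodes the file explicitly as UTF-8 avoids the default-platform-charset trap. A small Java illustration of both sides of that, using an arbitrary sample string:

```java
import java.nio.charset.StandardCharsets;

public class CharsetSketch {

    // Encodes text as UTF-8 and decodes it back; Chinese survives.
    public static String roundTripUtf8(String text) {
        byte[] bytes = text.getBytes(StandardCharsets.UTF_8);
        return new String(bytes, StandardCharsets.UTF_8);
    }

    // Encoding through a single-byte charset replaces every Chinese code
    // point with '?', which is exactly the corruption described above.
    public static String throughLatin(String text) {
        byte[] bytes = text.getBytes(StandardCharsets.ISO_8859_1);
        return new String(bytes, StandardCharsets.ISO_8859_1);
    }

    public static void main(String[] args) {
        String sample = "\u4e2d\u6587"; // "Chinese" written in Chinese
        System.out.println(roundTripUtf8(sample).equals(sample)); // true
        System.out.println(throughLatin(sample));                 // "??"
    }
}
```

The same principle applies to the Toad export: the flat file must be written in a Unicode encoding end to end, and the client NLS_LANG must match, or the conversion to a Western code page destroys the characters before the XLS import ever sees them.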
My NLS_SESSION_PARAMETERS in database are as follows
PARAMETER VALUE
NLS_LANGUAGE AMERICAN
NLS_TERRITORY AMERICA
NLS_CURRENCY $
NLS_ISO_CURRENCY AMERICA
NLS_NUMERIC_CHARACTERS .,
NLS_CALENDAR GREGORIAN
NLS_DATE_FORMAT DD-MON-RR
NLS_DATE_LANGUAGE AMERICAN
NLS_SORT BINARY
NLS_TIME_FORMAT HH.MI.SSXFF AM
NLS_TIMESTAMP_FORMAT DD-MON-RR HH.MI.SSXFF AM
NLS_TIME_TZ_FORMAT HH.MI.SSXFF AM TZR
NLS_TIMESTAMP_TZ_FORMAT DD-MON-RR HH.MI.SSXFF AM TZR
NLS_DUAL_CURRENCY $
NLS_COMP BINARY
NLS_LENGTH_SEMANTICS BYTE
NLS_NCHAR_CONV_EXCP FALSE
-
Help Export TABLE Records into Flat File with INSERTs - error
Hi,
When I try to run this procedure I get this error:
ORA-00932: inconsistent datatypes: expected - got -
Can anybody tell me why?
Thanks
CREATE OR REPLACE PROCEDURE generate_stmt(prm_table_name IN VARCHAR2,
prm_where_clause IN VARCHAR2,
prm_output_folder IN VARCHAR2,
prm_output_file IN VARCHAR2) IS
TYPE ref_cols IS REF CURSOR;
mmy_ref_cols ref_cols;
mmy_column_name VARCHAR2(100);
mmy_column_data_type VARCHAR2(1);
mmy_col_string VARCHAR2(32767);
mmy_query_col_string VARCHAR2(32767);
V_FILE_HNDL UTL_FILE.file_type;
begin
OPEN mmy_ref_cols FOR
SELECT LOWER(column_name) column_name
FROM user_tab_columns
WHERE table_name = UPPER(prm_table_name)
ORDER BY column_id;
LOOP
FETCH mmy_ref_cols
INTO mmy_column_name;
EXIT WHEN mmy_ref_cols%NOTFOUND;
mmy_col_string := mmy_col_string || mmy_column_name || ', ';
mmy_query_col_string := mmy_query_col_string || ' ' || mmy_column_name || ',';
END LOOP;
CLOSE mmy_ref_cols;
V_FILE_HNDL := UTL_FILE.FOPEN('TEST','TESST.TXT', 'W');
mmy_col_string := 'INSERT INTO ' || LOWER(prm_table_name) || ' (' ||
CHR(10) || CHR(9) || CHR(9) || mmy_col_string;
mmy_col_string := RTRIM(mmy_col_string, ', ');
mmy_col_string := mmy_col_string || ')' || CHR(10) || 'VALUES ( ' ||
CHR(9);
mmy_query_col_string := RTRIM(mmy_query_col_string,
' || ' || '''' || ',' || '''' || ' || ');
dbms_output.put_line(mmy_column_name);
OPEN mmy_ref_cols
FOR ' SELECT ' || mmy_query_col_string ||
' FROM ' || prm_table_name ||
' ' || prm_where_clause;
loop
FETCH mmy_ref_cols
INTO mmy_query_col_string;
EXIT WHEN mmy_ref_cols%NOTFOUND;
mmy_query_col_string := mmy_query_col_string || ');';
UTL_FILE.PUT_LINE(V_FILE_HNDL, mmy_col_string);
UTL_FILE.PUT_LINE(V_FILE_HNDL, mmy_query_col_string);
end loop;
end;

Buddy,
Try this..
CREATE OR REPLACE PROCEDURE generate_stmt
(prm_table_name IN VARCHAR2,
prm_where_clause IN VARCHAR2,
prm_output_folder IN VARCHAR2,
prm_output_file IN VARCHAR2) IS
TYPE ref_cols IS REF CURSOR;
mmy_ref_cols ref_cols;
mmy_column_name VARCHAR2(100);
mmy_column_data_type VARCHAR2(1);
mmy_col_string VARCHAR2(32767);
mmy_query_col_string VARCHAR2(32767);
V_FILE_HNDL UTL_FILE.file_type;
begin
OPEN mmy_ref_cols FOR
SELECT LOWER(column_name) column_name
FROM user_tab_columns
WHERE table_name = UPPER(prm_table_name)
ORDER BY column_id;
LOOP
FETCH mmy_ref_cols
INTO mmy_column_name;
EXIT WHEN mmy_ref_cols%NOTFOUND;
mmy_col_string := mmy_col_string || mmy_column_name || ', ';
mmy_query_col_string := mmy_query_col_string || ' ' || mmy_column_name || ',';
END LOOP;
CLOSE mmy_ref_cols;
mmy_col_string := 'INSERT INTO ' || LOWER(prm_table_name) || ' (' ||CHR(10) || CHR(9) || CHR(9) || mmy_col_string;
mmy_col_string := RTRIM(mmy_col_string, ', ');
mmy_col_string := mmy_col_string || ')' || CHR(10) || 'VALUES ( ' ||CHR(9);
mmy_query_col_string := RTRIM(mmy_query_col_string,' || ' || '''' || ',' || '''' || ' || ');
V_FILE_HNDL := UTL_FILE.FOPEN('TEST','TESST.TXT', 'W');
OPEN mmy_ref_cols FOR 'SELECT ' || mmy_query_col_string ||' FROM ' || prm_table_name ||' ' || prm_where_clause;
loop
FETCH mmy_ref_cols INTO mmy_query_col_string;
EXIT WHEN mmy_ref_cols%NOTFOUND;
mmy_query_col_string := mmy_query_col_string || ');';
UTL_FILE.PUT_LINE(V_FILE_HNDL, mmy_col_string);
UTL_FILE.PUT_LINE(V_FILE_HNDL, mmy_query_col_string);
end loop;
UTL_FILE.FCLOSE(V_FILE_HNDL);
END;
This will work only for a table with one single column.
Look at the line below:
FETCH mmy_ref_cols INTO mmy_query_col_string;
mmy_query_col_string is declared as a string, so it can hold only a single value. That is why, when you try this block on a table with more than one column, mmy_query_col_string would have to hold table-row-type data, which it cannot.
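The single-string limitation can be sidestepped by assembling the INSERT statement from the column list and one row of values at a time. A hedged Java sketch of the same statement-building idea the PL/SQL above is aiming for; the table and column names are illustrative:

```java
import java.util.List;

public class InsertStatementSketch {

    // Builds one INSERT statement from a column list and one row of values,
    // quoting each value the way the UTL_FILE approach intended.
    public static String buildInsert(String table, List<String> columns, List<String> values) {
        StringBuilder sb = new StringBuilder("INSERT INTO ").append(table)
                .append(" (").append(String.join(", ", columns)).append(") VALUES (");
        for (int i = 0; i < values.size(); i++) {
            if (i > 0) sb.append(", ");
            // double any embedded quotes, as SQL '...' literals require
            sb.append('\'').append(values.get(i).replace("'", "''")).append('\'');
        }
        return sb.append(");").toString();
    }

    public static void main(String[] args) {
        System.out.println(buildInsert("emp",
                List.of("id", "name"),
                List.of("1", "O'Brien")));
        // INSERT INTO emp (id, name) VALUES ('1', 'O''Brien');
    }
}
```

In the PL/SQL version the equivalent fix is to concatenate the quoted column values into one expression inside the dynamic SELECT, so the fetch still targets a single string.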
Good luck!!
Bhagat
-
Exporting to Media Encoder and file locating issue
Ok, Now I've read a LOT of threads trying to find a solution to this but none of them have answered my problem exactly, so here's what I have:
Running a New computer with a clean install of Windows 7 64 bit and Adobe CS4 Master Suite
Hardware:
Dell Precision M6400
Intel(R) Core(TM)2 Duo CPU T9900 @ 3.06 Ghz, 3.07 Ghz
12.0 GB Ram
Premiere ran fine for a while, but soon I've run into a problem where I try to export to the Media Encoder and I keep getting the error:
"Could not read from the source. Please check if it has moved or been deleted" over and over
I have deleted recently imported media one at a time to try and locate what's causing it since it wasn't doing this before, but no luck. So I restarted the program but now the Premiere project won't open. The task bar loads until about 2/3 in and stops there indefinitely, I've left and came back to it 2 hours later and it hadn't moved. I back tracked and found a previous save that will open, but it still will not export, it still says "Could not read from the source. Please check if it has moved or been deleted". And as soon as I update anything and save a new file that one will not open either.
I've tried recompressing each video file into a format I know works well with Premiere but still no difference.
This problem has plagued my workflow for a while now and I can't take it anymore. I've had to restart clean project files to try to work around it but I lose so much progress, and the work around by rendering the premiere file through After effects hasn't worked either. The error I get there says "Dynamic link server is busy".
Here's what the problem is not: It is not because I uninstalled CS3 or any other previous version, I've never had it on this computer. It's not a plugin problem, I don't have any plugins on this install except for the default ones that Came with Adobe Master Suite for Premiere Pro.
Someone please help me; it's ridiculous that I have had to troubleshoot the program this much. Both CS4 and Windows 7 have been around for over a year now, and neither is in beta.

Wow, sorry, forgive me if I've seen most forums work with a lot less; your dismissiveness led me to assume you hadn't read my post.
All of this is new and up to date as far as Adobe and Windows updates go; that's version 4.2.1.
My projects for this are all in NTSC Widescreen DV 24P, the particular sequence that I render from is in that format as well, although regardless of which one I use the problem keeps occurring.
My renders for my recent purposes have been compressed in .h264, IPOD video Large. The codec I'm rendering in also doesn't seem to matter, because I've tried this in .avi., .mov, .h264, .wav, etc. and all with various settings that I thought might help, but the encoder quits encoding 4 seconds in and gives me the error that it can't find the media.
And I didn't mention the media type and so on because the problem keeps happening regardless of what I use. I'm pretty sure it's a problem with the project file, or the media encoder, or both.
And I don't know what you mean by "disks"
Do I have a Raid set up on my hard disk? No, I have two 0.5 TB hard drives, Programs on one (C:\) Projects on the other (D:\).
What address am I saving the scratch disks to? I use the default of "same as Project". Have I tried Clearing it? Yes.
I don't think I'm getting at what you're asking.
And I don't think I've ever used an .xlsm (which is excel?) file and I've definitely never used a Power point project (PPS?) file in my projects. And they open in their respective programs, although I'm not sure why you asked about that. Again I'm not sure I know what you mean.
The Premiere Project (.prproj) file is what I'm trying to open when the progress bar stops at 2/3 the way through and forces me to terminate the process with the task manager to close it.
It would help to mention that all my projects are animated and I use premiere to cobble them together, so no problems that relate to transferring media from a camera to the computer would apply here. Most of the files I get out of Maya are in Default .avi or .mov files, and I also use a handful of .wav files for the dialogue and .jpgs for cards and placeholders.
I believe that this problem began when I updated a few of the files in my project, but I've been doing that for weeks without incident. I've been updating them by saving over the old media file with the updated one of the same name, length, and type.
I don't work exclusively with Premiere, in fact it's not what I usually use, that would be Final Cut Pro. -
Export color vs. print to file color issue with 2.3RC
Hi,
I'm having problems with exporting photos for web-site usage.
In short: colors get extremely washed out and plain when using "export to disk" (requesting sRGB color profile, in jpg-format). But, when "printing to file" using sRGB profile and rendering intent perceptual colors are reasonably close to what I see in LR. Yes, I know, sRGB is limited, but...
Why there is such a difference when both methods are using sRGB for generated jpg? Is it the fact that rendering intent can not be selected for "export to disk"?
As I can not control pixel dimensions in Print mode is not an option for my use (and I can not control sharpening with plug-in, but the color issue is there to begin with without any export plugin intervention).
In contrast, LR 1.4 export to disk gives reasonable colors, the color issue has appeared with 2.x series.
Running Windows XP SP3, LR 2.3RC, display calibrated with Color Vision SPyder 2.
Please help, this is driving me nuts :(

I think that your examples show the same problem as my observations.
It seems that your display has an even smaller gamut than sRGB (on my notebook, a DELL D620, this is also the case), so after you calibrated it, both Lightroom and color-managed Firefox show more saturated colors than Windows Explorer showing the exported image (as far as I know, export from the Library module works correctly, but Windows Explorer does not convert to your monitor profile, which is why the image looks undersaturated). This is correct. It is really hard to calibrate such a low-quality display, which can often be found on cheap notebooks; I have my own experience with it. Blue shades are really weak, so heavy corrections have to be made in the profile (generated with the Spyder in your case), which can result in purple skies (why? Read this: http://www.brucelindbloom.com/index.html?WorkingSpaceInfo.html).
The problem is with the image printed to file. This should look the same as a file exported from the Library module, given that you use sRGB both as the destination space in the Library export and as the printer profile while printing to file. In your case LR printed your image using values corresponding to your monitor profile (even though sRGB was selected), which resulted in an oversaturated image (because your display has a smaller gamut than sRGB). In my case, images printed to file are undersaturated (because my monitor has a gamut wider than sRGB). But I'm sure this is a bug! Your monitor profile should not affect printing to file in any case if sRGB is selected as the printer profile!
Summary: purple sky in LR and in color-managed FF is OK, but purple sky in an image printed to file is not OK. :-)
And my hint: do not try to calibrate this display... :-)
You can try to make some measurements of your UNCALIBRATED NB display using this freeware: http://www.homecinema-fr.com/colorimetre/index_en.php You will probably see a very uneven color temperature across grayscale and much smaller gamut than sRGB.
Let us know your findings. -
Exporting specific columns from XLS file to a new XLS or XLSX file using powershell
The scenario is that i have a file that our server is constantly dumping data to on a daily basis and out of all the data i only need 2 columns. I need to be a able to automate this because i need to generate a monthly report based on those two columns.
That being said, i need to be able to extract two full columns from datafile1.xls to newdatafile.xlsx. If it would be easier if the original file is CSV i can easily save the data in CSV format instead of XLS.
Thanks in advance.

I see. I'm having a hard time executing the script; I keep getting the following error:
"Microsoft.Jet.OLEDB.4.0" has not been registered.
and
when I try to run it like this, I don't get any errors but nothing happens:
C:\Windows\syswow64\rundll32.exe "C:\Program Files\Common Files\System\Ole DB\oledb32.dll",c:\temp\t.ps1
This is the code, which to tell you the truth I'm not really that familiar with:
$strFileName = "C:\temp\original\styles.xls"
$strSheetName = 'STYLES$'
$strProvider = "Provider=Microsoft.Jet.OLEDB.4.0"
$strDataSource = "Data Source = $strFileName"
$strExtend = "Extended Properties=Excel 8.0"
# Column names containing spaces must be bracketed
$strQuery = "Select [STYLE CODE], [LAST UPDATE] from [$strSheetName]"
$objConn = New-Object System.Data.OleDb.OleDbConnection("$strProvider;$strDataSource;$strExtend")
$sqlCommand = New-Object System.Data.OleDb.OleDbCommand($strQuery)
$sqlCommand.Connection = $objConn
$objConn.open()
$DataReader = $sqlCommand.ExecuteReader()
While($DataReader.read())
{
    $ComputerName = $DataReader[0].Tostring()
    "Querying $computerName ..."
    Get-WmiObject -Class Win32_Bios -computername $ComputerName
}
$dataReader.close()
$objConn.close()
For both PowerShell and VBScript you are missing the libraries. You need to download and install the ACE drivers.
Search for "ACE drivers".
¯\_(ツ)_/¯
-
Provisioning User IDs in Remedy Help Desk with Sun IdM 7.0.
Hi,
Our team is in the process of defining an approach to provisioning user IDs in the Remedy Help Desk system using Sun IdM version 7.0.
What we want to know is whether it is possible to use the Remedy resource adapter bundled with Sun IdM 7.0 to provision user IDs. We believe this resource adapter is used to provision help desk tickets into the help desk system, not user IDs. Is that understanding correct?
If user IDs cannot be provisioned using the resource adapter, we are planning the following approach to provision user IDs into Remedy:
1. Understand the table schema of the Remedy database.
2. Configure the Database Table resource adapter to provision into the Remedy user tables.
We are looking for inputs from people who have come across a similar design issue with Remedy Help Desk and could validate our design approach. We will highly appreciate any input on this.
Thank You.
Regards,
Vallabh Vengulekar.

"We think that this resource adapter is used to provision help desk tickets into the help desk system and not user IDs"
Hi, as per your post: where did you find this information? I am looking for information on how to manage Remedy tickets through IDM.
If you can help me, that would be great. Looking forward to your inputs.
Thanks in advance.
Error while importing the user Records through SCC7
Hi Basis Gurus,
We are performing the post-DB-refresh activities for a sandbox.
The refresh was done from D21 to S21.
We are stuck with an error while importing the user records (SCC7).
The exported user-records request from D21 was placed in the trans directory of S38 and imported into S21 using TP commands at the OS level. The import completed successfully (RC 4, with warnings).
But now, when we run SCC7 (post-import activities) in the background, the process gets cancelled right at the beginning.
Error found in the SCC3 log:
Table logging disabled in program RSCLXCOP by user SAP*
R/3 Version is 4.6C OS:AIX5.3
Your suggestions will be highly appreciated.
Regards,
Sitaram

Hey Sitaram,
It looks like you have incorrect client copy parameters in the S21 system.
You can change the parameters in S21
by executing report RSCLXCOP (via transaction SE38 or SA38) in S21.
Look for the parameters related to table logging / SAP*.
good luck! -
Anyone have experience with the Sun IdM data exporter / warehouse functionality?
Does anyone have experience with the Sun IdM data exporter / warehouse functionality? There is not much documentation about how to debug it. I created everything as described in the document. Everything seems to run fine, and I see the following under Server Tasks -> Run Tasks:
Data Warehouse Exporter Data Warehouse Exporter Configurator executing
Prior to that I created the database and the 50 tables as the doc says.
I created accounts and modified an email address. Nothing reaches my warehouse database, and I don't know where to look for errors. Any information is appreciated.

Hi there,
I have been looking at the source code and I think I have found the problem.
IDM determines whether to update or create a resource account by attempting to fetch the user from the resource. If the user exists it updates; otherwise it creates.
In the code, if the user does not exist, the code throws the exception EntityDoesNotExist(1301). The code then catches this exception and returns a null back to IDM, indicating that the user does not exist.
Well, that is what the source code says, but it does not match the actual behaviour...
I then decompiled the actual class (jar) files, and the code there does NOT catch the exception, so it bubbles up to IDM, which regards it as an error.
So the jar file that is on the website has a bug in it. The source code in SVN is correct, but it appears that the jar file was not rebuilt.
I am attempting to rebuild a new version of the jar file...
John I
-
Linking a new resource with a user account in Sun IDM via ActiveSync
Hi,
I have a new resource that contains user records. Now I want to link that resource to existing and new users in Sun IDM.
I do not want to update or create users on the new resource. I just need a link to be created in Sun IDM whenever ActiveSync runs on a user's account.
Please guide me on how to achieve this.
Regards,
Nitin

I'm afraid I can't share the exact code, but it should be straightforward with the following:
1. Define a field (call it ldapDN).
2. Create a rule that uses getResourceObjects to search for the user's DN and return it into ldapDN.
3. When ldapDN is not null, expand waveset.resources and add your LDAP resource, like:
<Field name='waveset.resources'>
  <Expansion>
    <append>
      <ref>waveset.resources</ref>
      <s>LDAP</s>
    </append>
  </Expansion>
</Field>
4. Then set the accountId for that resource:
<Field name='accounts[LDAP].accountId'>
  <Default><ref>ldapDN</ref></Default>
</Field>
and you should be set... hope it helps.
-
Is it possible to install the Sun IDM Apache Tomcat web server in a different zone than the JBoss web app container? If so, can anyone provide documentation on how to do this?
Delete a user record from oracle db
Hi,
Can anybody share a custom workflow that deletes a user record from the Oracle DB? The user does exist in Sun IDM.
Please help me.
Thanks in advance.

You can use a trigger and UTL_FILE to write to a file (on the server).
Example:
create or replace trigger test_file
after insert or delete or update on test_case
for each row
declare
   v_logfile utl_file.file_type;
begin
   v_logfile := utl_file.fopen('\myfiles', 'test_file.log', 'a');
   if inserting then
      utl_file.put_line(v_logfile, 'Inserting into table');
   elsif deleting then
      utl_file.put_line(v_logfile, 'Deleting from table');
   else
      utl_file.put_line(v_logfile, 'Updating table');
   end if;
   utl_file.fclose(v_logfile);
end test_file;
I want to generate a log file into which I dump some useful messages whenever anyone performs a DML operation on a table. I also want a switch, like YES or NO (maybe an environment variable): if I set it to YES the log file should be generated; if set to NO, no log file is generated.
Can anyone help with how to do this?
Thanks a lot in advance
srini -
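The YES/NO switch asked about above can be read from an environment variable (or a configuration table in the PL/SQL version) so logging can be toggled without touching the code. A hedged Java sketch of just the switch logic; the variable name AUDIT_LOG is made up:

```java
public class LogSwitchSketch {

    // Returns true only when the switch value is "YES" (case-insensitive);
    // any other value, or an unset variable (null), disables logging.
    public static boolean loggingEnabled(String switchValue) {
        return "YES".equalsIgnoreCase(switchValue);
    }

    public static void main(String[] args) {
        // In real use the value would come from the environment:
        // String v = System.getenv("AUDIT_LOG");
        System.out.println(loggingEnabled("YES")); // true
        System.out.println(loggingEnabled(null));  // false
    }
}
```

In the trigger above, the equivalent check would read the flag from a small configuration table before calling UTL_FILE, since database sessions do not normally see OS environment variables.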
Hi all,
I have a couple of questions about tuning to get IDM working better.
- We are currently using Java 1.5 update 19. For tuning purposes, is it important to update Java to the latest update (1.5 update 22)?
- The OS where we have installed Glassfish is Solaris 5.10 without the latest patches. Do you think that updating Solaris with the latest patches will improve IDM's performance?
These questions came out of the Sun tuning docs, which say that good tuning also means keeping the environment up to date.
Best regards

madhatter wrote:
IdM manages database transactions when it writes to a repository
database. So if you change 400000 user records one by one, you will
have 400000 commits. I managed to make user creation/update much
faster by packaging all the changes into one transaction. But this requires
a bit of code at the (undocumented) data repository level. If you are
not afraid of that kind of coding, I will send you some code snippets.

During my tests I found that IDM takes an average of 0.129 seconds
to execute a single database transaction. For simple operations in IDM
this is acceptable, but for operations that involve a great mass of users this
could be a problem (for example, my colleague tried to install the last patch for
IDM, but it took so long, because the user base amounts to more than 800,000 users,
that he decided to stop, without success :( :( ).
Anyway, you can send me your code at this e-mail
address: gentjan81 @ gmail +.com+ . I'll try out your code on the test platform.
Thanks a lot.
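madhatter's suggestion, packaging many changes into one transaction, reduces the commit count from one per record to one per batch, and at the measured 0.129 s per transaction that arithmetic alone explains the speedup. A sketch of the batching math (pure arithmetic, no real repository calls; the batch size of 1000 is an assumption):

```java
public class BatchCommitSketch {

    // One commit per batch instead of one per record (ceiling division).
    public static int commitsNeeded(int records, int batchSize) {
        if (batchSize <= 0) throw new IllegalArgumentException("batchSize must be > 0");
        return (records + batchSize - 1) / batchSize;
    }

    public static void main(String[] args) {
        int perRecord = commitsNeeded(400_000, 1);     // 400000 commits, one per record
        int batched   = commitsNeeded(400_000, 1_000); // 400 commits, one per batch
        System.out.printf("%d vs %d commits; ~%.0f s vs ~%.0f s at 0.129 s each%n",
                perRecord, batched, perRecord * 0.129, batched * 0.129);
    }
}
```

At one commit per record, the commit overhead alone is roughly 400000 × 0.129 s ≈ 14 hours; batched at 1000 records per transaction it drops to under a minute, which matches the "much faster" claim above.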