Integrating flat file data into an LDAP directory using the Sunopsis driver
Hello
I need to import data from a CSV file into an LDAP directory.
To achieve this, I used the demo physical and logical file data server (called FILE_GENERIC) and set up a new LDAP data server following the tutorial "Oracle Data Integrator Driver for LDAP - User's Manual".
I can manually see and update data on both the file and LDAP datastores.
The problem is that I cannot manage to import/update data from the file to the LDAP directory through a dedicated interface.
I think the issue comes from the PK/FK used by the Sunopsis relational model to represent the directory.
The LDAP DN is represented by a set of two tables, holding in my example the organizational units on the one hand and the persons on the other, linked through an FK in persons to an auto-generated PK in organizational units. My person table also has an auto-generated PK. All the directory datastore tables have been reverse-engineered through ODI.
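For illustration only, here is a minimal sketch (in Python/SQLite, with invented table and column names, not the driver's actual schema) of that two-table relational model: an organizational-unit table with an auto-generated PK and a person table referencing it through an FK:

```python
import sqlite3

# Hypothetical illustration of how the LDAP driver flattens a DN into
# relational tables: one table per level, linked by auto-generated keys.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE organizationalunit (
    ou_id INTEGER PRIMARY KEY AUTOINCREMENT,  -- auto-generated PK
    ou    TEXT NOT NULL
);
CREATE TABLE person (
    person_id INTEGER PRIMARY KEY AUTOINCREMENT,  -- auto-generated PK
    cn        TEXT NOT NULL,                      -- natural update key
    ou_id     INTEGER NOT NULL REFERENCES organizationalunit(ou_id)
);
""")
# Inserting without an explicit PK lets the engine generate one,
# which is what one would expect the driver to do on insert.
cur = conn.execute("INSERT INTO organizationalunit (ou) VALUES ('people')")
ou_pk = cur.lastrowid
conn.execute("INSERT INTO person (cn, ou_id) VALUES ('Adrien', ?)", (ou_pk,))
row = conn.execute(
    "SELECT cn, ou FROM person JOIN organizationalunit USING (ou_id)"
).fetchone()
print(row)  # ('Adrien', 'people')
```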
In my interface, I always use the cn as the update key.
I first tried not mapping the person PK in the interface, letting the driver generate it for me (or mapping a null PK). I then catch in the Operator a message like: "null : java.sql.SQLException: Try to insert null into a non-nullable column".
Nevertheless, the first row is created in the directory and a new PK is given in the ODI datastore. Curiously, this is not, as I would presume, the last PK value + 1.
There are gaps in the ID sequences.
I even tried checking "tolerated error" in the IKM step called "Insert new rows". I'm using the IKM shipped with ODI: "IKM SQL Incremental Update". The sequence finishes in the Operator but, due I guess to the caught error, the other rows are not processed. (In any case, I shouldn't have to tolerate errors.)
I then tried putting unused custom PK values into my file and mapping that column to the LDAP datastore PK column, without much success: only one row is processed. Furthermore, the PK value in the datastore differs from the one I put in the file.
I finally tried generating PK values through SQL statements by adding new steps to the IKM, but that did not work either.
I really don't see any other way to either have the driver construct a new PK at insert/update or make it ignore the null PK problem and process all the rows.
If anyone has an idea about this, please share...
Greetings,
Adrien
Hi,
I am facing an issue that is probably the same.
Using ODI 10.1.3.5, I can't insert new rows into my OpenLDAP.
One thing I see is that the execution takes the LDAP server as the staging area and wants to create the I$ table in it, even though the data are already imported into the LDAP server.
Thanks for any help.
Similar Messages
-
Flat file data loading error using process chain
Hi SAP Experts,
I am having a problem loading flat file data to the cube using a process chain. The issue is that when I run the process chain it fails with the message "Date format error, please enter the date in the format _.yyyy". I am using 0CALMONTH in the DataSource. What is strange is that when I manually execute the InfoPackage, I don't get any errors and am able to load the same file successfully to the DSO and the cube. Is there any special setting for the process chain that I am missing?
The date format in the flat file is MM/YYYY. I have tried every option I could, including recreating the DataSource, but in vain; I don't see any success so far. Please help me solve this problem as we are in the middle of the testing cycle.
Thanking you all for your quick response and support all the time.
Kind Regards,
Sanjeev
Hi Sanjeev,
I believe you are opening the .csv file again after saving it. In this case the initial 0 in a single-digit month (say 02/2010) gets changed to 2/2010, and the resulting file is not readable by the system. Just do not open the file again after you have entered the data in the .csv file and saved and closed it. If required, open the file in Notepad, but not in Excel, in case you want to re-check the data.
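As a small sketch of the leading-zero problem described above (assuming the expected period format is the strict two-digit-month form hinted at by the "_.yyyy" error message):

```python
# Excel rewrites "02/2010" as "2/2010", which no longer matches a strict
# MM/YYYY parser. A defensive transfer routine can re-pad the month.
def normalize_calmonth(value: str) -> str:
    """Turn 'M/YYYY' or 'MM/YYYY' into a zero-padded 'MM.YYYY' period."""
    month, year = value.split("/")
    return f"{int(month):02d}.{year}"

print(normalize_calmonth("2/2010"))   # 02.2010
print(normalize_calmonth("02/2010"))  # 02.2010
```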
Hope this helps.
regards,
biplab -
Loading flat file data using FDMEE with a one-to-many mapping
Hi All,
I need to load data from a flat file into a Hyperion Planning application using FDMEE, with a one-to-many mapping.
For example, the data file has 2 records:
Acc Actual Version1 Scene1 1000
Acc Actual Version1 Scene2 2000
Now the target application has 5 dimensions and the data needs to be loaded as:
acc Actual Version1 entity1 Prod2 1000
Acc Actual Version1 Entity2 Prod2 2000
Please suggest
Regards
Anubhav
From your example I don't see the one-to-many mapping requirement. You have one source data line that maps to a single target intersection. Where is the one-to-many mapping requirement in your example?
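For what it's worth, a genuine one-to-many mapping would mean one source row expanding into several target rows. A minimal sketch (member names and split ratios are invented for illustration, not FDMEE syntax):

```python
# One source member ("Scene1") fans out to several target entities,
# splitting the amount by a hypothetical ratio.
SPLIT = {"Scene1": [("Entity1", 0.6), ("Entity2", 0.4)]}

def expand(row):
    acc, scenario, version, scene, amount = row
    targets = SPLIT.get(scene, [(scene, 1.0)])  # pass through if unmapped
    return [(acc, scenario, version, ent, round(amount * ratio, 2))
            for ent, ratio in targets]

print(expand(("Acc", "Actual", "Version1", "Scene1", 1000)))
```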
-
How to save HR data in Active Directory using ABAP, i.e. through the LDAP Connector
Hi All,
Can anyone please help me out with how to save HR data in Active Directory using the LDAP Connector?
Please help ASAP as it is very urgent.
Thanks
Jitendra
There are hundreds of such scripts online.
Here are a few tips and links; you will find more.
https://gallery.technet.microsoft.com/scriptcenter/Feeding-data-to-Active-0227d15c
http://blogs.technet.com/b/heyscriptingguy/archive/2012/10/31/use-powershell-to-modify-existing-user-accounts-in-active-directory.aspx
http://powershell.org/wp/forums/topic/ad-import-csv-update-attributes-script/
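Besides the PowerShell route above, one hedged sketch of the idea in Python: turn HR records into an LDIF file that a directory import tool (e.g. ldapadd or ldifde) could load. The attribute names follow common AD conventions, but the OU, domain, and field names are invented for this example:

```python
import csv, io

# Hypothetical HR extract: personnel number, first and last name.
hr_csv = "pernr,first,last\n00000042,Albert,Einstein\n"

def to_ldif(row, base="OU=Staff,DC=example,DC=com"):
    """Render one HR record as an LDIF 'add' entry (sketch only)."""
    cn = f"{row['first']} {row['last']}"
    return "\n".join([
        f"dn: CN={cn},{base}",
        "objectClass: user",
        f"cn: {cn}",
        f"sAMAccountName: {row['first'][0].lower()}{row['last'].lower()}",
        f"employeeNumber: {row['pernr']}",
        "",  # blank line separates LDIF entries
    ])

for row in csv.DictReader(io.StringIO(hr_csv)):
    print(to_ldif(row))
```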
Please mark this as answer if it helps -
Query on integrating windows file server into SAP KM using WEBDAV
hi
I have successfully integrated a Windows file server into SAP KM using WebDAV. I have a query regarding the possible validation against the portal database user. Can we configure it such that the user comparison happens for LDAP as well as database users? Has anyone configured such a scenario?
Regards,
Ganesh N
Hi Ganesh,
this should work in principle.
However you would need a user in Active Directory for each user in the portal database that should connect to the file server if you are using the SSO22KerbMap Module as I assume.
In my whitepaper I have mentioned this for the internal user index_service that does only exist in the portal database.
Best regards,
André -
How to save HR data in Active Directory using ABAP
Hi all
Can anyone please help me out with how to save HR data in Active Directory using the LDAP connector?
Please help as this is a very urgent requirement.
thanks in advance
Thanks
Chanti
What form do you have the user's name in?
For ANTIPODES\alberte:
String searchFilter = "(&(objectClass=user)(samAccountName=alberte))";
For [email protected]:
String searchFilter = "(&(objectClass=user)(userPrincipalName=[email protected]))";
For Albert Einstein:
String searchFilter = "(&(objectClass=user)(givenName=Albert)(sn=Einstein))";
or, using Ambiguous Name Resolution (ANR):
String searchFilter = "(&(objectClass=user)(anr=Albert Einstein))";
or it's even clever enough to use:
String searchFilter = "(&(objectClass=user)(anr=Einstein Albert))";
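As a side note, when the name parts come from user input, the special characters defined in RFC 4515 should be escaped before building filters like the ones above. A small Python sketch of such a helper (the function names are invented):

```python
# Escape the RFC 4515 special characters so user input cannot break
# (or inject into) the search filter.
def escape_ldap(value: str) -> str:
    for ch, esc in (("\\", r"\5c"), ("*", r"\2a"), ("(", r"\28"),
                    (")", r"\29"), ("\0", r"\00")):
        value = value.replace(ch, esc)
    return value

def user_filter(**attrs):
    """Build an AND filter over objectClass=user plus the given attributes."""
    parts = "".join(f"({k}={escape_ldap(v)})" for k, v in attrs.items())
    return f"(&(objectClass=user){parts})"

print(user_filter(givenName="Albert", sn="Einstein"))
# (&(objectClass=user)(givenName=Albert)(sn=Einstein))
```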
How to store flat file data in a custom table?
Hi,
I am working on an inbound interface. Can anyone tell me how to store flat file data in a custom table? What is the procedure?
Regards,
Sujan
Hi,
You can use function F4_FILENAME to pick the file from the front end, then use function WS_UPLOAD to upload it:

AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_file.
  CALL FUNCTION 'F4_FILENAME'   " F4 help to pick the file
    EXPORTING
      field_name = 'P_FILE'
    IMPORTING
      file_name  = p_file.

CALL FUNCTION 'WS_UPLOAD'
  EXPORTING
    filename = p_file
  TABLES
    data_tab = it_line.

* Loop over it_line, splitting each row into the fields of your custom table.
LOOP AT it_line.
  SPLIT it_line AT ',' INTO itab-name itab-surname.
  APPEND itab.
ENDLOOP.

Then you can insert the values into your custom table from itab (for example, INSERT ztable FROM TABLE itab, where ztable is your custom table).
regards
Isaac Prince -
Flat file data load - ODS - Look up data in Startroutine
Hi All,
There is a requirement where I have two ODSs, say ABC and XYZ.
For both ODSs, we load flat file data.
First we load data into ABC for the current fiscal period.
During the data load to XYZ, we look up the part numbers in ABC for the current fiscal period; if the data is there, for those part numbers we load the XYZ flat file data.
My requirement is to fetch the part numbers from ABC for fiscal periods earlier than the current fiscal period, then extract the already-loaded data for those part numbers from XYZ and mark the flag field as non-reportable.
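The lookup logic described here can be sketched as follows (field names, period values, and the flag value are invented for illustration; in the real start routine this would operate on the data package):

```python
# Take part numbers from ABC whose fiscal period is before the current one,
# then flag the matching, already-loaded XYZ rows as non-reportable.
CURRENT_PERIOD = "2024003"

abc = [{"part": "P1", "period": "2024001"},
       {"part": "P2", "period": "2024003"}]
xyz = [{"part": "P1", "flag": ""}, {"part": "P2", "flag": ""}]

older_parts = {r["part"] for r in abc if r["period"] < CURRENT_PERIOD}
for row in xyz:
    if row["part"] in older_parts:
        row["flag"] = "N"  # hypothetical non-reportable marker

print(xyz)  # P1 flagged, P2 untouched
```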
How can we achieve this?
Please advise.
Thanks
Ajay
Hi,
Thanks for your reply.
I have done so. When I add the lines of my internal table to DATA_PACKAGE, I get a syntax error saying the two structures are not identical. We have several fields in the XYZ ODS that are not in the communication structure.
That's the problem. Also, if I write the code, will it be executed for each data package? I mean, for each data packet processed, my code would fetch the whole data from ABC and then from XYZ. Wouldn't there be repetition?
Can I use any other logic?
Thanks
Ajay -
BDC (Flat File Data Validation) - Code
I am trying to validate the flat file data BEFORE performing the BDC (call transaction or session).
Please help me out with the code below for XK02.
DATA: BEGIN OF itab OCCURS 0,   " itab holds the flat file data
        lifnr(16),
        bukrs(4),
        ekorg(4),
      END OF itab.
DATA: BEGIN OF int_final OCCURS 0,
        lifnr(16),
        bukrs(4),
        ekorg(4),
        status(6),
        message(6),
      END OF int_final.
DATA: fs_final LIKE LINE OF int_final.
DATA: wa_itab LIKE LINE OF itab.
DATA: validate_itab LIKE TABLE OF wa_itab.   " validate_itab holds the master data

FORM data_validation.
  SELECT lfb1~lifnr lfb1~bukrs lfm1~ekorg INTO TABLE validate_itab
    FROM lfb1 INNER JOIN lfm1 ON lfb1~lifnr = lfm1~lifnr.
  IF sy-subrc = 0.
    SORT validate_itab BY lifnr bukrs ekorg.
  ENDIF.
  LOOP AT itab INTO wa_itab.
    READ TABLE validate_itab WITH KEY
        lifnr = wa_itab-lifnr
        bukrs = wa_itab-bukrs
        ekorg = wa_itab-ekorg
        BINARY SEARCH TRANSPORTING NO FIELDS.
    IF sy-subrc NE 0.
      PERFORM f_error_log USING text-005.   " Invalid value set
      CONTINUE.
    ENDIF.
  ENDLOOP.
ENDFORM.                    " data_validation

*& Form f_error_log
FORM f_error_log USING l_message TYPE string.
  CLEAR fs_final.
  fs_final-lifnr   = wa_itab-lifnr.
  fs_final-bukrs   = wa_itab-bukrs.
  fs_final-ekorg   = wa_itab-ekorg.
  fs_final-status  = text-014.   " Error
  fs_final-message = l_message.
  APPEND fs_final TO int_final.
ENDFORM.                    " f_error_log
Thanks..
Hi Gaurav,
I have a small question about the validation.
If LFM1~LIFNR does not contain any value, how are you comparing the two? One more thing: after getting the data using GUI_UPLOAD, you will have the data in validate_tab.
LOOP AT validate_tab INTO wa_itab.
  SELECT lfb1~lifnr lfb1~bukrs lfm1~ekorg APPENDING TABLE validate_itab
    FROM lfb1 INNER JOIN lfm1 ON lfb1~lifnr = lfm1~lifnr
    WHERE lfb1~lifnr = wa_itab-lifnr.
ENDLOOP.
Thanks,
Errors when loading flat file data
We just tested loading a very simple flat file with only two lines, and the two lines of data in the InfoSource preview are correct. But when we run the InfoPackage to load the data, the InfoPackage monitor shows the following errors (between the two dashed lines below):
Error getting SID for ODS object ZDM_SUBS
Activation of data records from ODS object ZDM_SUBS terminated
Error when assigning SID (details in long text)
Error when assigning SID (details in long text)
Error when assigning SID (details in long text)
Error when assigning SID (details in long text)
Error when assigning SID (details in long text)
Error when assigning SID (details in long text)
Error when assigning SID (details in long text)
Error when assigning SID (details in long text)
Value 'Bottom' (hex. '0042006F00740074006F006D') of characteristic ZRATEPLN contains invalid characters
Value 'Dealer' (hex. '004400650061006C00650072') of characteristic ZCHANNEL contains invalid characters
Value 'Bottom' (hex. '0042006F00740074006F006D') of characteristic ZRATEPLN contains invalid characters
Value '19884/' of characteristic 0DATE is not a number with 000008 spaces
Value '/19812' of characteristic 0DATE is not a number with 000008 spaces
Value '19884/' of characteristic 0DATE is not a number with 000008 spaces
In the flat file (an Excel sheet saved as a CSV file), each row of data has two fields, start_date and end_date, in MM/DD/YYYY format. In the transfer rule we convert the date format from MM/DD/YYYY to YYYYMMDD, which is required by the DATS InfoObject type in BW. If you need the Excel sheet of data in order to answer our questions about the above errors, give us your e-mail address and we can send you the simple two-row sheet.
Thanks!
Hi Kevin,
1. You can use lowercase letters in the values for your characteristics, provided you have checked the lowercase checkbox on the General tab of the Create Characteristic screen. But when you do so, no master data tables, text tables, or another level of attributes underneath are allowed.
OR
Use only uppercase letters in your characteristic values, unchecking the above-mentioned box.
2. The date format in the CSV file should be YYYYMMDD; it should have 8 characters. I guess there is something strange in your calendar days, since I could not find 8 characters irrespective of the order. Do not forget the zeroes.
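A sketch of the conversion the transfer rule has to guarantee, MM/DD/YYYY into the 8-character YYYYMMDD that DATS expects (malformed values like the '19884/' in the error log above would raise an error rather than load silently):

```python
from datetime import datetime

def to_dats(value: str) -> str:
    """Convert 'MM/DD/YYYY' to the 8-character 'YYYYMMDD' DATS format."""
    return datetime.strptime(value.strip(), "%m/%d/%Y").strftime("%Y%m%d")

print(to_dats("02/05/2010"))  # 20100205
```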
Hope this works.
Reward if it is helpful.
Regards,
Balaji -
Need help in loading flat file data which has \r at the end of a string
Hi There,
Need help in loading flat file data, which has \r at the end of a string.
I have a flat file with three columns. In the data, at the end of the second column, there is a \r. Because of this, the cursor goes to the beginning of the next line, and the rest of the line is loaded into the next line.
Can someone pls help me to remove escape character \r from the data?
thanks,
rag
Have you looked into the sed Linux command? Here are some details:
When working with txt files, or with the shell in general, it is sometimes necessary to replace certain characters in existing files. In such cases sed can come in handy:
sed -i 's/foo/bar/g' FILENAME
The -i option makes sure that the changes are saved back to the same file; if you are not sure that sed will work as you expect, use it without that option and provide an output filename instead. The s is for substitute, foo is the pattern you are searching the file for, bar is the replacement string, and the g flag makes sure that all hits on each line are replaced, not just the first one.
If you have to replace special characters like a dot or a comma, they have to be escaped with a backslash to make clear that you mean the literal character, not some control command:
sed -i 's/\./,/g' *.txt
So for your case, stripping the carriage returns would be:
sed -i 's/\r//g' FILENAME
Sed should be available on every standard installation of any distribution. At least on Fedora it is even required by core system parts like udev.
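The same fix can of course be done in whatever language loads the file; here is a minimal in-memory sketch in Python (the sample data is invented):

```python
# A row whose second column ends in a stray carriage return, as described
# in the question; removing every "\r" restores the intended single line.
raw = "col1\tcol2\r\tcol3\n"
clean = raw.replace("\r", "")
print(repr(clean))  # 'col1\tcol2\tcol3\n'
```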
If this helps, mark as correct or helpful. -
Flat File Data Loads to BI 7.0
Hi Experts,
Please advise on the best approach for the following flat file data load scenario.
I will get the data in Excel, with two worksheets, from the user.
My requirement is to place the file in a central location available to the user and to BW, to update it if any changes are necessary, and to load the data (full) to BW from the file if there are any changes.
Please advise how to deal with this scenario of two worksheets in a central location.
Thanks
The easiest thing would be to use a DSO with a change log to handle the changes to pass on to any cubes, and load a full every night.
Then let the change log worry about any changes to the workbook.
You have to be careful about the DSO keys, though, for this to work properly.
Now, to automate the loads: how are you planning to create the InfoPackage, as it will only read a CSV and not the binary XLS?
Well, it could read the binary XLS if you use DBConnect with a JDBC driver to read the XLS (that's on my list of things to try next; but if you are, as your user ID suggests, a "BW learner", that may be a bit complicated).
The only other thing to do is to write a macro that automatically creates the CSV file on the application server when the user quits the XLS.
Or of course you can just dump the CSV each night; but then that is a manual task, and in the systems I design I hate manual tasks, as staff go on holiday, people change jobs, and it's not really very SOX compliant. -
Custom code for Flat file reconciliation on LDAP
Hello,
I have to write custom code for flat file reconciliation on LDAP, as the GTC connector wasn't working entirely.
Could someone help me out with this? How do I do it?
Thanks
What do you mean by flat file on LDAP?
If you want to create a flat file connector, then search Google for reading a flat file using Java.
Define the RO fields and do the mapping in the process definition. You can use the Xellerate User RO for trusted recon.
Make a map of the CSV columns to the recon fields.
Call the Reconciliation API -
Converting Flat File data into XML
Hi Experts,
Consider the message type of the SENDER system and the flat file data:
<dt_sender>
<root>
<header1> 0..1
<f1>
<f2>
<f3>
<header2> 0..1
<f4>
<f5>
<f6>
<item> 1..unbounded
<f7>
<f8>
<f9>
<f10>
<f11>
<f12>
</item>
abc def ghi jkl mno pqr
123 123 123 123 123 123
456 456 456 456 456 456
How do we convert the flat file data into the following XML? Please note that each field value is separated by a TAB delimiter. What parameters should be used?
<root>
<Header1>
<f1>abc</f1>
<f2>def</f2>
<f3>ghi</f3>
</Header1>
<Header2>
<f4>jkl</f4>
<f5>mno</f5>
<f6>pqr</f6>
</Header2>
<item>
<f7>123</f7>
<f8>123</f8>
<f9>123</f9>
<f10>123</f10>
<f11>123</f11>
<f12>123</f12>
<f7>456</f7>
<f8>456</f8>
<f9>456</f9>
<f10>456</f10>
<f11>456</f11>
<f12>456</f12>
</item>
Points will be given for correct answers.
Thanks in advance.
FAisal
Edited by: Abdul Faisal on Feb 29, 2008 5:53 AM
Faisal,
When you read a multiple-recordset structure file, each record in the txt file should have a header from which you can identify which segment it should go to; you identify it using the keyFieldValue in the file adapter.
for this input file
abc def ghi jkl mno pqr
123 123 123 123 123 123
456 456 456 456 456 456
abc def ghi can be read into header1 using the file adapter with a key field value, but using the same file adapter you cannot put the rest of the line into header2.
Instead, you should read the whole row abc def ghi jkl mno pqr into a single field and write a UDF to split the data into header1 and header2.
Similarly, you have to take care of the item records.
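As an illustration of that split-by-a-UDF approach, here is a minimal Python sketch; the element names f1-f12 come from the structure above, while the splitting logic itself is invented for the example (not actual file-adapter configuration):

```python
import xml.etree.ElementTree as ET

# The whole flat file read as lines; we distribute the tab-separated
# values onto the target structure ourselves.
flat = ("abc\tdef\tghi\tjkl\tmno\tpqr\n"
        "123\t123\t123\t123\t123\t123\n"
        "456\t456\t456\t456\t456\t456\n")

lines = [line.split("\t") for line in flat.strip().splitlines()]
root = ET.Element("root")
h1 = ET.SubElement(root, "Header1")
h2 = ET.SubElement(root, "Header2")
for i, v in enumerate(lines[0][:3], start=1):   # first 3 fields -> Header1
    ET.SubElement(h1, f"f{i}").text = v
for i, v in enumerate(lines[0][3:], start=4):   # last 3 fields -> Header2
    ET.SubElement(h2, f"f{i}").text = v
item = ET.SubElement(root, "item")
for row in lines[1:]:                           # remaining rows -> item
    for i, v in enumerate(row, start=7):
        ET.SubElement(item, f"f{i}").text = v

print(ET.tostring(root, encoding="unicode"))
```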
If your input file were instead something like this:
abc def ghi
jkl mno pqr
123 123 123 123 123 123
456 456 456 456 456 456
abc identifies Header 1,
JKL Header 2, and so on...
Then read the whole line into a single field and write a UDF to split it into header1 and header2, and similarly for the items. -
CONVERSION_EXIT_CUNIT error occurred while loading the flat file data
Hi
I am trying to load flat file data into an ODS and I am getting an error like "Error Conversion CUNIT".
We are using 0UNIT in the ODS, for which CUNIT is a conversion routine.
Can you please suggest why I am getting this error?
Hi Sunil,
Please check whether you are loading the flat file from the application server or from the client workstation.
If you are loading from the client workstation, you may face a problem of this type.
Try to check if there is any change of format in the file.
Delete the spaces at the end of the file.