Data Loads via an Integration Interface
In our Demantra 7.3 environment, Open Orders, Pricing and Trade Spend all get loaded via Integration Interfaces (IIs). Those IIs get called after ep_load_main runs, and there is nothing in the workflow after those II calls.
So how does the data get transferred from the BIIO tables into sales_data and/or mdp_matrix? Since there is nothing in the workflow to do that, I would expect to see triggers on the BIIO tables, but there aren't any.
Hi,
Data is loaded from the BIIO tables into sales_data/mdp_matrix by a 'Transfer Step' in the Demantra workflow.
The transfer step takes the import integration interface as input and loads data from the corresponding BIIO tables into the Demantra base tables.
Please check whether there is a transfer step in the workflow after the ep_load_main run.
Thanks,
Rohit
Similar Messages
-
Hi,
Please let me know the process to control the EIS data load into Essbase using a rules file. I did not find the option in EIS; please help me.
Thanks,
pr
In EIS
1) You have to define the logical OLAP model connecting to the relational source. It defines the joins between the fact table and dimension tables.
2) Based on the OLAP model, you have to create a metaoutline, which defines the rules for loading members and data into Essbase. -
Master Data load via DTP (Updating attribute section taking long time)
Hi all,
I am loading to a Z InfoObject. It is a master data load for attributes. Surprisingly, the PSA pulls records very quickly (2 minutes), but the DTP which updates the InfoObject takes a lot of time; it runs into hours.
Observing the DTP execution monitor, which shows the breakup of time between extraction, filter, transformation and updating of attributes,
I could see that the last step, "updating of attributes for the InfoObject", is taking most of the time.
The master data InfoObject also has two compounded InfoObjects.
They are also mapped in the transformation.
The number of parallel processes for the DTP is set to 3 in our system,
with job class "C".
Can anyone think of what the reason could be?
Hi,
Check transaction ST22 for any short dump while loading this master data; there may have been one.
There is also a chance that you are trying to load invalid data (such as a "!" character as the first character of a field) into the master data.
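As a quick illustration of that last point (not from the thread; BW's actual permitted-character set is controlled via transaction RSKC, and the leading "!" is just the example given above), a pre-check over the incoming values could look like:

```python
# Flag master-data values that BW would likely reject, e.g. a value
# starting with "!". The real permitted-character rules come from RSKC;
# this sketch only checks the example case from the answer above.
def suspicious_values(values, bad_leading="!"):
    return [v for v in values if v and v[0] in bad_leading]

records = ["1000", "!200", "A300", ""]
print(suspicious_values(records))  # ['!200']
```

Running such a check on the source file before the DTP run can separate "invalid data" dumps from genuine performance problems.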
Regards,
Yogesh. -
Data Loading via Infopackage vs UCBATCH01
Hello Experts,
I am new to BCS. I need to load data into a BCS cube. Currently we have a process chain that loads data into BCS (data stream load).
In that process chain, it seems the data is loaded using program UCBATCH01.
There are times when users need to load data manually (such as adjustment entries). We are thinking of automating this process, so I have created a process chain for it, which picks up the data file from the application server to load into BCS.
In this process chain I am thinking of using an InfoPackage, which can load data to BCS using the file from the application server.
I cannot work out whether I should use the UCBATCH01 program or the InfoPackage described above.
Please advise.
Thank you,
Murtuza. -
Decimal places lost after Account metadata load via ERP Integrator
Hi!
Has anybody faced something like this?
I'm losing my HFM app's decimal places after loading metadata from ERPi > Metadata.
EPM System 11.1.2.0
Oracle EBS 12
Thanks!
-Cheers!
The issue is that you are mapping "Data Value" to amount in the Target Application import format:
SELECT SCENARIO "Scenario"
,YEAR "Year"
,PERIOD "Period"
,DATAVIEW "View"
,DATAVALUE "Value"
,ACCOUNT "Account"
,ENTITY "Entity"
,ICP "ICP"
,UD2 "Area"
,UD1 "Tail"
,UD3 "Special"
,UD4 "Facility"
,AMOUNT "DataValue"
FROM AIF_HS_BALANCES
WHERE LOADID = 300
You need to map AMOUNT to "AMOUNT" in the HFM application. Check that the dimension mapping is correct for the class in the target application and that your import format is going to the proper target dimension (Amount). -
Dear All,
I have few doubts on ERP Integrator.
1) What things are required from Oracle GL to Planning for data loading using ERP Integrator? (Is the trial balance enough, or do we require some other file from Oracle GL?)
2) Is there any scheduling options available for Data loading using ERP Integrator?
3) what is process for loading the data to Planning using ERP Integrator?
4) How we load the data to Planning? (i.e. monthly load, hourly load)
Anyone please guide me in this situation.
Thanks,
PC
1) What things are required from Oracle GL to Planning for data loading using ERP Integrator? (Is the trial balance enough, or do we require some other file from Oracle GL?)
Assuming you have the right version of Oracle EBS, ERP Integrator queries the tables within the Oracle EBS database to get the appropriate information. In my case, the trial balance file was enough. Within the trial balance file you will have the appropriate dimension intersection (account, entity, period, etc.), the type of account (asset vs. liability, etc.) and finally the dollar amount.
2) Is there any scheduling options available for Data loading using ERP Integrator?
Yes. You can use FDQM to map and validate the data, then load it via the command line using the FDQM batch scheduler.
3) what is process for loading the data to Planning using ERP Integrator?
I'll try to do my best to summarize. (Assuming you are using FDQM) Create rules in ERPi -> Configure the adapters in the Workbench Client for the ERPi Rules -> Configure the FDQM Web Client to call the Adapters set in the Workbench Client -> Import the data into FDQM. Then from here you can call your command line automation for batching if you wish.
4) How we load the data to Planning? (i.e. monthly load, hourly load)
This depends on your business. Assuming you are going to load the data for budget and planning purposes, then maybe your business is happy with a monthly load (and most of the time this is the case). An hourly load might be helpful if you deal with users who need up-to-date actuals. Loading hourly actuals data might be overkill for a budget or planning application, but I have run into situations where it was needed, and then found myself worried about speeding up the calculations after the data is loaded. Long story short, you can load monthly or hourly. -
I would like to transfer all data from my iPod classic to my new computer with Windows 8.1. My old computer's CPU died. I am using iTunes, which only transfers albums purchased at the iTunes Store; the CDs were originally loaded via iTunes.
Install disk drive from old computer in an external enclosure.
Then copy the complete iTunes library from the disk drive to the disk drive in the new computer. -
Data loaded to Power Pivot via Power Query is not yet supported in SSAS Tabular Cube
Hello, I'm trying to create an SSAS Tabular cube from data loaded into Power Pivot via Power Query (SAP BOBJ connector), but it looks like this is not yet supported.
Any one tried this before? any workaround that make sense?
The final goal is to pull data from SAP BW and a BO universe (using Power Query) and be able to create an SSAS Tabular cube.
Thanks in advance
Sebastian
Sebastian,
Depending on the size of the data from Analysis Services, one workaround could be to import the data into Excel, make an Excel table, and then use the Excel table as a data source.
Reeves
Denver, CO -
"UNICODE_IN_DATA" error in ODI 11.1.1.5 data load interface
Hello!
I am sorry to have to ask for help again, with a new issue in ODI 11.1.1.5. This is a multiple-column data load interface. I am loading data from a tab-delimited text file into Essbase ASO 11.1.2. The ODI repository database is MS SQL Server. In the target datastore, some fields are not mapped to the source but are hardcoded with a fixed value; for example, since only budget data is loaded by default, the mapping for the "Scenario" field in the target has the input string 'Budget'. This data load interface has no rules file.
At "Prepare for loading" step the following error is produced:
org.apache.bsf.BSFException: exception from Jython:
Traceback (most recent call last):
File "<string>", line 86, in <module>
AttributeError: type object 'com.hyperion.odi.common.ODIConstants' has no attribute 'UNICODE_IN_DATA'
at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:346)
at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2458)
at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:48)
at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2906)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2609)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:540)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:453)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1740)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:338)
at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:214)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:272)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:263)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:822)
at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:123)
at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:83)
at java.lang.Thread.run(Thread.java:662)
I will be very grateful for any hints.
Have you changed any of the Hyperion Java files?
I have not seen this exact error before, but I have seen errors like this when the KM is not in sync with the Java files.
Also, I always suggest using a rules file.
If you have changed the files, revert to the original odihapp_common.jar and see if it works; if you changed the files to get around the issues I described in the blog, you should be all right having changed only odihapp_essbase.jar.
This is the problem now with Oracle and all their different versions and patches of ODI; it seems to me they put effort into the 10.1.3.x Hyperion modules and then in 11.1.1.5 just gave up and totally messed a lot of things up.
I hope somebody from Oracle reads this, because they need to get their act together.
Cheers
John
http://john-goodwin.blogspot.com/ -
ERP integrator Tasks - Data Load Rule
Hi
We are facing issues with ERP Integrator 11.1.2.2.
Some of the options are not available in the ERP Integrator Data Load Rule task.
Earlier we had the Add option in the Data Load Rule under Source Filter -> Segment Values, and all the ERP Integrator task screens now look different from before.
Please find attached Screen shots of Current and Earlier.
Actually, we have applied a patch for HFM.
Now it is showing
HFM version -- 11.1.2.2.303.3959
ERP Integrator -- 11.1.2.2.0.0
This issue started after the patch was applied.
Please guide us on what we have to do to resolve this issue.
Edited by: Nagarjuna Reddy on May 22, 2013 12:31 PM
No, I did not apply any patch for ERP Integrator.
But we have applied a patch for HFM. Is it affecting ERP Integrator? -
ODI - How to clear a slice before executing the data load interface
Hi everyone,
I am using ODI 10.1.3.6 to load data daily into an ASO cube (version 11.1.2.1). Before loading data for a particular date, I want the region defined by that date to be cleared in the ASO cube.
I suppose I need to run a PRE_LOAD_MAXL_SCRIPT that clears the area defined by an MDX function, but I don't know how I can automatically define the region by looking at several columns in the data source.
Thanks a lot.
Hi, thank you for the response.
I know how to clear a region in an ASO database. I wrote a MaxL statement like the following:
alter database App.Db clear data in region '{([DAY].[Day_01],[MONTH].[Month_01],[YEAR].[2011])}'
physical;
I have 3 separate dimensions: DAY, MONTH and YEAR. My question was that I don't know how to automate the clearing process before each data load for a particular date.
Can I somehow automatically set the day, month and year information in the MDX function by looking at the day, month and year columns in the relational data source? For example, if I am loading data for 03.01.2011, I want my MDX function to become {([DAY].[Day_01],[MONTH].[Month_03],[YEAR].[2011])}. In the data source table I also have separate columns for day, month and year, which should make it easier, I guess.
I also thought of using substitution variables to define the region, but then again the variables would need to be set according to the day, month and year columns in the data source table. I should also mention that the data source table is truncated and loaded daily, so there can't be more than one day or one month, etc., in the table.
I don't know if I have stated my problem clearly; please let me know if there are any confusing bits.
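One way to automate this is to generate the MaxL statement from the source date columns in a pre-load step. This is a sketch, not from the thread; it reuses the App.Db names and the Day_xx/Month_xx member naming convention from the MaxL above, which you would adjust to your outline:

```python
def clear_region_maxl(app, db, day, month, year):
    # Build the member tuple following the [DAY].[Day_01] /
    # [MONTH].[Month_03] naming convention used in the thread.
    region = "{{([DAY].[Day_{0:02d}],[MONTH].[Month_{1:02d}],[YEAR].[{2}])}}".format(
        day, month, year)
    return "alter database {0}.{1} clear data in region '{2}' physical;".format(
        app, db, region)

# For a load dated 03.01.2011 (Day_01, Month_03 per the example above):
print(clear_region_maxl("App", "Db", 1, 3, 2011))
```

In ODI this could run as a Jython/procedure step that selects the distinct day, month and year values from the staging table and feeds them into the function before the load interface executes.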
Thanks a lot. -
Need suggestions for imporving data load performance via SQL Loader
Hi,
Our requirement is to load 512 (1 GB each) files in Oracle database.
We are using SQL loaders to load files into the DB (A partitioned table) and have tried almost all the possible options that come with sql loaders (Direct load path, parallel=true, multithreading=true, unrecoverable)
As the table grows bigger, each file's load time increases (it started at 5 minutes per file and has now reached 2 hours per 3 files, and it keeps increasing with every batch; note that we load 3 files concurrently into the target table using the parallel=true option of SQL Loader).
Questions 1:
My problem is that somehow multithreading is not working for us (we have a multi-CPU server and have enabled multithreading=true). Could it be something to do with a DB setting that might be preventing the data load from running in multiple threads?
Question 2:
Would gathering stats on the target table and its partitions help improve load performance? I'm not sure whether stats improve DML; they would definitely improve SQL queries. Any thoughts?
Question 3:
What would be the best strategy to gather stats on this table (which would end up having 512 GB data) ?
Question 4:
Do you think insertions in a partitioned table (with growing sizes) would have poor performance as compared to a non-partitioned table ?
Any other suggestions to improve performance are most welcome!
Thanks,
Sachin
Edited by: Sachin Tiwari on Mar 13, 2013 6:29 AM
2 hours to load just 3 GB of data seems unreasonable regardless of the SQL Loader settings. It seems likely to me that the problem is not with SQL Loader but somewhere else.
Have you generated a Statspack/ AWR/ ASH report to see where all that time is being spent? Are there triggers on the table? Are there bitmap indexes?
Is your table partitioned in a way that is designed to improve the efficiency of loads, so that all the data from one file goes into one partition? Or is data from each file getting inserted into many different partitions?
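For reference, the loader options discussed in the question can be assembled into concurrent invocations like this (a sketch; the connect string, control file and data file names are hypothetical placeholders):

```python
# Build three concurrent direct-path SQL*Loader command lines, matching
# the direct/parallel/multithreading options from the question.
# UNRECOVERABLE is a control-file keyword (UNRECOVERABLE LOAD DATA ...),
# not a command-line flag, so it is not shown here.
def sqlldr_command(userid, control, datafile, logfile):
    return ("sqlldr userid={0} control={1} data={2} log={3} "
            "direct=true parallel=true multithreading=true").format(
                userid, control, datafile, logfile)

cmds = [sqlldr_command("scott/tiger@orcl", "load.ctl",
                       "file_{0:03d}.dat".format(i),
                       "file_{0:03d}.log".format(i))
        for i in range(1, 4)]
print(len(cmds))  # in practice each command would be run via subprocess
```

Checking the generated log files after each run would show whether the sessions actually used direct path and how long each phase took.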
Justin -
How to populate Values in Value Set via API or Interface
Dear friends,
I need to know whether there is any API or interface available to load thousands of values into a particular value set. I know Dataload is an alternative, but I don't want to use it because the data volume is too large; it's not feasible.
waiting for your response
Thanks.
There is no API for fnd_flex_values.
You will have to do a direct table insert (or call FND_FLEX_VALUES_PKG which does nothing but a direct table insert).
Sandeep Gandhi -
CALL_FUNCTION_CONFLICT_TYPE Standard Data loading
Hi,
I am facing a data loading problem using Business content on CPS_DATE infocube (0PS_DAT_MLS datasource).
The R/3 extraction processes without any error, but the problem occurs in the update rules while updating the milestone date. Please find hereunder the log from the ST22.
The real weird thing is that the process works perfectly in development environment and not in integration one (the patch levels are strongly the same: BW 3.5 Patch #16).
I apologise for the long message below... this is a part of the system log.
For information the routine_0004 is a standard one.
Thanks a lot in advanced!
Cheers.
CALL_FUNCTION_CONFLICT_TYPE
Except. CX_SY_DYN_CALL_ILLEGAL_TYPE
Symptoms. Type conflict when calling a function module
Causes Error in ABAP application program.
The current ABAP program "GP420EQ35FHFOCVEBCR6RWPVQBR" had to be terminated because one of the statements could not be executed.
This is probably due to an error in the ABAP program.
A function module was called incorrectly.
Errors analysis
An exception occurred. This exception is dealt with in more detail below
. The exception, which is assigned to the class 'CX_SY_DYN_CALL_ILLEGAL_TYPE', was neither caught nor passed along using a RAISING clause, in the procedure "ROUTINE_0004"
"(FORM)" .
Since the caller of the procedure could not have expected this exception
to occur, the running program was terminated. The reason for the exception is:
The call to the function module "RS_BCT_TIMCONV_PS_CONV" is incorrect:
The function module interface allows you to specify only fields of a particular type under "E_FISCPER".
The field "RESULT" specified here is a different field type.
How to correct the error.
You may able to find an interim solution to the problem in the SAP note system. If you have access to the note system yourself, use the following search criteria:
"CALL_FUNCTION_CONFLICT_TYPE" CX_SY_DYN_CALL_ILLEGAL_TYPEC
"GP420EQ35FHFOCVEBCR6RWPVQBR" or "GP420EQ35FHFOCVEBCR6RWPVQBR"
"ROUTINE_0004"
If you cannot solve the problem yourself and you wish to send
an error message to SAP, include the following documents:
1. A printout of the problem description (short dump)
To obtain this, select in the current display "System->List->
Save->Local File (unconverted)". 2. A suitable printout of the system log To obtain this, call the system log through transaction SM21. Limit the time interval to 10 minutes before and 5 minutes after the short dump. In the display, then select the function
"System->List->Save->Local File (unconverted)".
3. If the programs are your own programs or modified SAP programs, supply the source code.
To do this, select the Editor function "Further Utilities-> Upload/Download->Download".
4. Details regarding the conditions under which the error occurred
or which actions and input led to the error.
The exception must either be prevented, caught within the procedure
"ROUTINE_0004"
"(FORM)", or declared in the procedure's RAISING clause.
To prevent the exception, note the following:
Environment system SAP Release.............. "640"
Operating system......... "SunOS" Release.................. "5.9"
Hardware type............ "sun4u"
Character length......... 8 Bits
Pointer length........... 64 Bits
Work process number...... 2
Short dump setting....... "full"
Database type............ "ORACLE"
Database name............ "BWI"
Database owner........... "SAPTB1"
Character set............ "fr"
SAP kernel............... "640"
Created on............... "Jan 15 2006 21:42:36" Created in............... "SunOS 5.8 Generic_108528-16 sun4u"
Database version......... "OCI_920 "
Patch level.............. "109"
Patch text............... " "
Supported environment....
Database................. "ORACLE 9.2.0.., ORACLE 10.1.0.., ORACLE 10.2.0.."
SAP database version..... "640"
Operating system......... "SunOS 5.8, SunOS 5.9, SunOS 5.10"
SAP Release.............. "640"
The termination occurred in the ABAP program "GP420EQ35FHFOCVEBCR6RWPVQBR" in
"ROUTINE_0004".
The main program was "RSMO1_RSM2 ".
The termination occurred in line 702 of the source code of the (Include)
program "GP420EQ35FHFOCVEBCR6RWPVQBR"
of the source code of program "GP420EQ35FHFOCVEBCR6RWPVQBR" (when calling the editor 7020).
Processing was terminated because the exception "CX_SY_DYN_CALL_ILLEGAL_TYPE" occurred in the procedure "ROUTINE_0004" "(FORM)" but was not handled locally, not declared in the RAISING clause of the procedure.
The procedure is in the program "GP420EQ35FHFOCVEBCR6RWPVQBR ". Its source code starts in line 685 of the (Include) program "GP420EQ35FHFOCVEBCR6RWPVQBR ".
672 'ROUTINE_0003' g_s_is-recno
673 rs_c_false rs_c_false g_s_is-recno
674 changing c_abort.
675 catch cx_foev_error_in_function.
676 perform error_message using 'RSAU' 'E' '510'
677 'ROUTINE_0003' g_s_is-recno
678 rs_c_false rs_c_false g_s_is-recno
679 changing c_abort.
680 endtry.
681 endform.
682 ************************************************************************
683 * routine no.: 0004
684 ************************************************************************
685 form routine_0004
686 changing
687 result type g_s_hashed_cube-FISCPER3
688 returncode like sy-subrc
689 c_t_idocstate type rsarr_t_idocstate
690 c_subrc like sy-subrc
691 c_abort like sy-subrc. "#EC *
692 data:
693 l_t_rsmondata like rsmonview occurs 0 with header line. "#EC *
694
695 try.
696 * init
variables
697 move-corresponding g_s_is to comm_structure.
698
699 * fill the internal table "MONITOR", to make monitor entries
700
701 * result value of the routine
>>>> CALL FUNCTION 'RS_BCT_TIMCONV_PS_CONV'
703 EXPORTING
704 I_TIMNM_FROM = '0CALDAY'
705 I_TIMNM_TO = '0FISCPER'
706 I_TIMVL = COMM_STRUCTURE-CALDAY
707 I_FISCVARNT = gd_fiscvarnt
708 IMPORTING
709 E_FISCPER = RESULT.
710 * if the returncode is not equal zero, the result will not be updated
711 RETURNCODE = 0.
712 * if abort is not equal zero, the update process will be canceled
713 ABORT = 0.
714
715 catch cx_sy_conversion_error
716 cx_sy_arithmetic_error.
717 perform error_message using 'RSAU' 'E' '507'
718 'ROUTINE_0004' g_s_is-recno
719 rs_c_false rs_c_false g_s_is-recno
720 changing c_abort.
721 catch cx_foev_error_in_function.
System zones content
Name Val.
SY-SUBRC 0
SY-INDEX 2
SY-TABIX 0
SY-DBCNT 0
SY-FDPOS 65
SY-LSIND 0
SY-PAGNO 0
SY-LINNO 1
SY-COLNO 1
SY-PFKEY 0400
SY-UCOMM OK
SY-TITLE Moniteur - Atelier d'administration
SY-MSGTY E
SY-MSGID RSAU
SY-MSGNO 583
SY-MSGV1 BATVC 0000000000
SY-MSGV2 0PROJECT
SY-MSGV3
SY-MSGV4
Selected variables
Nº 23 Tpe FORM
Name ROUTINE_0004
GD_FISCVARNT
22
00 RS_C_INFO I
4
9
COMM_STRUCTURE-CALDAY
20060303
33333333
20060303
SYST-REPID GP420EQ35FHFOCVEBCR6RWPVQBR 4533345334444454445355555452222222222222 704205135686F365232627061220000000000000
RESULT
000
333
00
You have an update routine in which you are calling FM 'RS_BCT_TIMCONV_PS_CONV'. The parameter E_FISCPER must have the same type as the variable you use (you can see the data type in the FM definition, transaction SE37). You should do something like the following.
DATA: var TYPE <the same type as E_FISCPER in the FM definition>.
CALL FUNCTION 'RS_BCT_TIMCONV_PS_CONV'
EXPORTING
I_TIMNM_FROM = '0CALDAY'
I_TIMNM_TO = '0FISCPER'
I_TIMVL = COMM_STRUCTURE-CALDAY
I_FISCVARNT = gd_fiscvarnt
IMPORTING
E_FISCPER = var.
result = var.
-
Hi,
We need to load data for around 600,000 records in Oracle Apps R12.1.1.
Please let me know the best practices for ensuring the data load completes as quickly as possible.
Regards,
V N
For such a large volume, you should use Oracle APIs / interface tables.
Check the following to see what is available.
Note: 462586.1 - Where are the Oracle® Release 12 (R12) API Reference Guide?
https://metalink.oracle.com/metalink/plsql/ml2_documents.showDocument?p_database_id=NOT&p_id=462586.1
Note: 458225.1 - Release 12 Integration Repository
https://metalink.oracle.com/metalink/plsql/ml2_documents.showDocument?p_database_id=NOT&p_id=458225.1
Note: 396116.1 - Oracle Integration Repository Documentation Resources Release 12
https://metalink.oracle.com/metalink/plsql/ml2_documents.showDocument?p_database_id=NOT&p_id=396116.1
Hope this helps,
Sandeep Gandhi