Running an import process
I realise I may need to take this to a more specialised list, but to start with I thought I would ask the more general community.
I have a retail POS application that inserts transaction rows into a staging table. I then need an internal database process to import these transactions via some sort of PL/SQL or Java stored procedure. My question is: what would be the best way of having such a process run in the background (similar to DBMS_JOB), either polling for new rows or being woken up by DBMS_ALERT when a row arrives?
Thanks for any thoughts
I'm not sure why you need a staging table at all. Why not have the POS application call a stored procedure that does whatever processing is necessary?
Assuming you do need a staging table, whether to use DBMS_JOB or DBMS_ALERT really depends on the requirements. DBMS_JOB would be ideal if you want to periodically process rows in batches, DBMS_ALERT if you want to process rows one at a time (in which case the whole staging table idea seems pointless) or based on some state being reached in the POS application (i.e. closing out the register).
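To make the batch option concrete, here is a minimal sketch of periodic processing via DBMS_JOB. The staging table name, its columns, and the five-minute interval are assumptions for illustration only, not from the original post:

```sql
-- Hypothetical staging table: pos_staging(txn_id, txn_data, processed_flag)
CREATE OR REPLACE PROCEDURE process_staged_txns IS
BEGIN
  FOR r IN (SELECT txn_id, txn_data
              FROM pos_staging
             WHERE processed_flag = 'N') LOOP
    -- ... whatever per-transaction import logic is needed ...
    UPDATE pos_staging
       SET processed_flag = 'Y'
     WHERE txn_id = r.txn_id;
  END LOOP;
  COMMIT;
END process_staged_txns;
/

-- Poll every five minutes, DBMS_JOB style
DECLARE
  l_job BINARY_INTEGER;
BEGIN
  DBMS_JOB.SUBMIT(job       => l_job,
                  what      => 'process_staged_txns;',
                  next_date => SYSDATE,
                  interval  => 'SYSDATE + 5/1440');
  COMMIT;
END;
/
```

On more recent releases DBMS_SCHEDULER is the usual replacement for DBMS_JOB, but the shape of the job is the same.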
Justin
Similar Messages
-
Query on long running client import process
Hi Gurus,
I have few queries regarding the parameter PHYS_MEMSIZE. Let me brief about the SAP server configuration before I get into the actual problem.
We are running ECC 6.0 on Windows 2003 SP2 (64-bit); the database is SQL Server 2005, with 16 GB RAM and a 22 GB page file.
As per the Zero Administration Memory Management guide, the rule of thumb is an SAP/DB split of 70/30, with the parameter PHYS_MEMSIZE set to approximately 70% of the installed main memory. Should I change the parameter as described in the guide? If so, what precautions should we take when changing memory parameters? Are there any major dependencies or known issues associated with this parameter?
The PHYS_MEMSIZE parameter is currently set to 512 MB.
A few days ago we had to perform a client copy using the export/import method. The export was normal and went well; however, the import took almost 15 hours to complete. Any clues as to the possible reasons for a long-running client copy in a SQL Server environment? I suspect PHYS_MEMSIZE being configured at 512 MB, which appears to be very low.
Please share your ideas and suggestions if anyone has experienced this sort of issue, because we are going to perform a client copy again in the next 10 days, so I really need your input on this.
Thanks & Regards,
Vinod
Edited by: vinod kumar on Dec 5, 2009 9:24 AM
Hi Nagendra,
Thanks for your quick response.
Our production environment is running Active/Active clustering with one central instance and one dialog instance. The database size is 116 GB with one data file, and the log file is 4.5 GB; both are shared in the cluster.
As you suggest, I would need to modify PHYS_MEMSIZE to 11 or 12 GB (70% of physical RAM). What precautions should I consider? I see there are many dependencies associated with this parameter as per its documentation.
The standard values of the following parameters are calculated according to PHYS_MEMSIZE:
em/initial_size_MB = PHYS_MEMSIZE (extension by PHYS_MEMSIZE / 2)
rdisp/ROLL_SHM
rdisp/ROLL_MAXFS
rdisp/PG_SHM
rdisp/PG_MAXFS
Should I make the changes on both the central instance and the dialog instance? Please clarify. Also, are there any other parameters I should adjust to speed up the client copy process?
Many Thanks...
Thanks & Regards,
Vinod -
Upgrading Stellent 7.5 to OCS 10gR3 Import Process failing HELP NEEDED
Hi,
I am upgrading Stellent 7.5 to Oracle Content Server 10gR3. Here is what I have done.
1. Migrated all the configuration from Stellent to 10gR3
2. Migrated the Folders from Stellent to 10gR3
3. Migrated the content by creating an Archive and then importing the Archive in 10gR3.
I am seeing a lot of errors in the log file. The following are the errors I see.
1.
Could not send mail message from (null) with subject line: Content Release Notification. Could not get I/O for connection to: hpmail.rtp.ppdi.com java.net.ConnectException: Connection timed out
2.
Import error for archive 'ProductionContent' in collection 'prod_idc': Invalid Metadata for 'ID_000025'. Virtual folder does not exist.
3.
Import error for archive 'ProductionContent' in collection 'prod_idc': Content item 'ID_004118' was not successfully checked in. The primary file does not exist.
4.
Import error for archive 'ProductionContent' in collection 'prod_idc': Content item 'ID_004213' was not successfully checked in. IOException (System Error: /u01/app/oracle/prod/ucm/server/archives/productioncontent/09-dec-21_23.29.44_396/4/vault/dmc_unblinded_documents/4227 (No such file or directory)) java.io.FileNotFoundException: /u01/app/oracle/prod/ucm/server/archives/productioncontent/09-dec-21_23.29.44_396/4/vault/dmc_unblinded_documents/4227
5.
Import error for archive 'ProductionContent' in collection 'prod_idc': Content item 'ID_031414' with revision label '2' was not successfully checked in. The release date (11/4/08 9:12 AM) of the new revision is not later than the release date (11/4/08 9:12 AM) of the latest revision in the system.
6.
Import error for archive 'ProductionContent' in collection 'prod_idc': Invalid Metadata for 'ID_033551'. Item with name '07-0040_IC_Olive-View_UCLA_ERI_Cellulitis_2008-08-26.pdf' already exists in folder '/Contribution Folders/2007/07-0040/07-0040Site_Specific_Documents/07-0040Olive_View_UCLA_Medical_Center/07-0040Archive/07-0040Essential_Documents_ARC/07-0040Informed_Consent_ARC/'.
7.
Import error for archive 'ProductionContent' in collection 'prod_idc': Aborting. Too many errors.
QUESTIONS:
Is there a way to keep the import process running even when errors occur? It looks like the import process stops in the middle once there are too many errors.
How do I find out the total number of folders and documents? I want to run the same query on Stellent 7.5 and on 10gR3 and compare the results, just to find out how much content was imported.
How do I run the import process over again? Half of the content was imported before the process failed in the middle; when running it again, what settings do I need to provide to make sure no duplicates get created?
Any help is really appreciated.
Thanks
Hi
There are a couple of ways to get around the issues that you are facing such that import process is not interrupted because of these. They are as follows :
1. Use the ArchiveReplicationException component. This will keep the import process running and make a log of the failed items, which can be used for gauging the success of the import and what needs to be redone.
I would suggest this as the option for your case.
2. Set the config variable for the archiver exception limit to 9999 so that the archive process stops only after hitting 9999 errors.
I would suggest going with option 1, as that is a much more foolproof and methodical way of knowing which items failed during import.
Thanks
Srinath -
Handling rejections in the Payable's Supplier Open Interface Import Process
I’m using the suppliers API to mass load the suppliers. I’m loading the tables AP_SUPPLIERS_INT, AP_SUPPLIER_SITES_INT and AP_SUP_SITE_CONTACT_INT by part; first the AP_SUPPLIERS_INT and then the other two tables. Due to various errors I get some rejections on the first table. If I want to correct the data on the interface tables, what should I do (on the rejected records that I want to correct)? (a)Should I correct the data and leave the STATUS and REJECTION_CODE as the API left them and re-run the Open Interface Import process? (b)Should I delete the contents of those fields? (c)Should I delete the entire table?
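To be concrete, option (b) would be something like the following sketch (the column names come from my description above; whether the import re-selects rows with a cleared STATUS is exactly what I am unsure about):

```sql
-- Clear the fields the API set on rejected rows, then re-run the import
UPDATE ap_suppliers_int
   SET status         = NULL,
       rejection_code = NULL
 WHERE status = 'REJECTED';

COMMIT;
```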
I tried option (a) but the process seemed to take forever compared to the first time I ran it and I canceled the request.
Thanks in advance.
Hi,
Unhide the Debug (Debug Switch) parameter of the report "Supplier Open Interface Import" and run the program with the Debug flag set to Yes.
Please post the log to help us understand the issue.
Regards,
Sridhar -
Getting an error while running the BPEL process from Java
Hi All,
We are using the following Java code to run the BPM process.
package callBPMProcess;
import java.util.Hashtable;
import java.util.UUID;
import java.util.List;
import javax.naming.Context;
import oracle.soa.management.facade.Locator;
import oracle.soa.management.facade.LocatorFactory;
import oracle.soa.management.facade.Composite;
import oracle.soa.management.facade.Service;
import oracle.soa.management.facade.CompositeInstance;
import oracle.soa.management.facade.ComponentInstance;
import oracle.fabric.common.NormalizedMessage;
import oracle.fabric.common.NormalizedMessageImpl;
import oracle.soa.management.util.CompositeInstanceFilter;
import oracle.soa.management.util.ComponentInstanceFilter;
import java.util.Map;
import javax.xml.transform.*;
import javax.xml.transform.dom.*;
import javax.xml.transform.stream.*;
import org.w3c.dom.Element;
import java.io.*;

public class StartProcess {

    public StartProcess() {
        super();
        Hashtable jndiProps = new Hashtable();
        jndiProps.put(Context.PROVIDER_URL, "http://ytytry.4234434.com:7001/soa-infra");
        jndiProps.put(Context.INITIAL_CONTEXT_FACTORY, "weblogic.jndi.WLInitialContextFactory");
        jndiProps.put(Context.SECURITY_PRINCIPAL, "weblogic");
        jndiProps.put(Context.SECURITY_CREDENTIALS, "funnyj0ke");
        jndiProps.put("dedicated.connection", "true");

        String inputPayload =
            "<process xmlns=\"http://xmlns.oracle.com/HelloWorld/Helloworld/BPELProcess1\">\n" +
            " <input>hello</input>\n" +
            "</process>\n";

        Locator locator = null;
        try {
            // connect to the SOA server
            locator = LocatorFactory.createLocator(jndiProps);
            String compositeDN = "default/Helloworld!1.0";

            // find the composite
            Composite composite = locator.lookupComposite(compositeDN);
            System.out.println("Got Composite : " + composite.toString());

            // find the exposed service of the composite
            Service service = composite.getService("bpelprocess1_client_ep2");
            System.out.println("Got serviceName : " + service.toString());

            // build the input request and add it to an operation of the service
            NormalizedMessage input = new NormalizedMessageImpl();
            String uuid = "uuid:" + UUID.randomUUID();
            input.addProperty(NormalizedMessage.PROPERTY_CONVERSATION_ID, uuid);
            // "payload" is the part name of the process operation
            input.getPayload().put("payload", inputPayload);

            // "process" is the operation of the service
            NormalizedMessage res = service.request("process", input);

            Map payload = res.getPayload();
            Element element = (Element) payload.get("payload");
            TransformerFactory tFactory = TransformerFactory.newInstance();
            Transformer transformer = tFactory.newTransformer();
            transformer.setOutputProperty("indent", "yes");
            StringWriter sw = new StringWriter();
            StreamResult result = new StreamResult(sw);
            DOMSource source = new DOMSource(element);
            transformer.transform(source, result);
            System.out.println("Result\n" + sw.toString());

            System.out.println("instances");
            CompositeInstanceFilter filter = new CompositeInstanceFilter();
            filter.setMinCreationDate(new java.util.Date(System.currentTimeMillis() - 2000000));
            // get composite instances by filter
            List<CompositeInstance> obInstances = composite.getInstances(filter);
            // for each of the returned composite instances...
            for (CompositeInstance instance : obInstances) {
                System.out.println(" DN: " + instance.getCompositeDN() +
                    " Instance: " + instance.getId() +
                    " creation-date: " + instance.getCreationDate() +
                    " state (" + instance.getState() + "): " + getStateAsString(instance.getState()));
                // set up a component filter and get the child component instances
                ComponentInstanceFilter cInstanceFilter = new ComponentInstanceFilter();
                List<ComponentInstance> childComponentInstances = instance.getChildComponentInstances(cInstanceFilter);
                // for each child component instance (e.g. a BPEL process)
                for (ComponentInstance cInstance : childComponentInstances) {
                    System.out.println(" -> componentinstance: " + cInstance.getComponentName() +
                        " type: " + cInstance.getServiceEngine().getEngineType() +
                        " state: " + getStateAsString(cInstance.getState()));
                    System.out.println("State: " + cInstance.getNormalizedStateAsString());
                }
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    private String getStateAsString(int state) {
        // note that this is dependent on whether the composite state is captured or not
        if (state == CompositeInstance.STATE_COMPLETED_SUCCESSFULLY)
            return "success";
        else if (state == CompositeInstance.STATE_FAULTED)
            return "faulted";
        else if (state == CompositeInstance.STATE_RECOVERY_REQUIRED)
            return "recovery required";
        else if (state == CompositeInstance.STATE_RUNNING)
            return "running";
        else if (state == CompositeInstance.STATE_STALE)
            return "stale";
        else
            return "unknown";
    }

    public static void main(String[] args) {
        StartProcess startUnitProcess = new StartProcess();
    }
}
But we are getting the following error. Can somebody help us out?
SEVERE: Failed to create a DirectConnectionFactory instance (oracle.soa.api.JNDIDirectConnectionFactory): oracle.soa.api.JNDIDirectConnectionFactory
javax.naming.NoInitialContextException: Cannot instantiate class: weblogic.jndi.WLInitialContextFactory [Root exception is java.lang.ClassNotFoundException: weblogic.jndi.WLInitialContextFactory]
at javax.naming.spi.NamingManager.getInitialContext(NamingManager.java:657)
at javax.naming.InitialContext.getDefaultInitCtx(InitialContext.java:288)
at javax.naming.InitialContext.init(InitialContext.java:223)
at javax.naming.InitialContext.<init>(InitialContext.java:197)
at oracle.soa.management.internal.ejb.EJBLocatorImpl.<init>(EJBLocatorImpl.java:166)
at oracle.soa.management.facade.LocatorFactory.createLocator(LocatorFactory.java:35)
at callBPMProcess.StartProcess.<init>(StartProcess.java:53)
at callBPMProcess.StartProcess.main(StartProcess.java:152)
Caused by: java.lang.ClassNotFoundException: weblogic.jndi.WLInitialContextFactory
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:247)
at com.sun.naming.internal.VersionHelper12.loadClass(VersionHelper12.java:46)
at javax.naming.spi.NamingManager.getInitialContext(NamingManager.java:654)
... 7 more
Process exited with exit code 0.
Thanks in advanced,
Narasimha.
Edited by: parker on Mar 27, 2011 11:55 PM
Looks like you don't have the WebLogic classes on the classpath:
javax.naming.NoInitialContextException: Cannot instantiate class: weblogic.jndi.WLInitialContextFactory [Root exception is java.lang.ClassNotFoundException: weblogic.jndi.WLInitialContextFactory]
Other options for creating an instance are to use a web service call or one of the other adapters (e.g. JMS). If you need to directly start a process you might also look at this blog from the ATeam:
http://redstack.wordpress.com/worklist/
The specific example for getting the tasks is at: http://redstack.wordpress.com/2011/03/09/implementing-task-initiation/
Then the real work of creating an instance from the initiator tasks is at: http://redstack.wordpress.com/2011/03/09/creating-the-domain-layer-for-the-worklist/ -
We have a SharePoint 2010 Publishing Website that uses variations to deliver content in multiple languages. We are using a third-party translation company to translate publishing pages. The pages are exported using the export/import UI process described here: http://blogs.technet.com/b/stefan_gossner/archive/2011/12/02/sharepoint-variations-the-complete-guide-part-16-translation-support.aspx
Certain sub-sites are extremely content-intensive. They may contain many items in the Pages library as well as lists and other sub-sites.
For some sub-sites (not all), the exported CMP file contains no XML files. There should be a Manifest.XML, Requirements.XML, ExportSettings.XML, etc., but there are none. After renaming the CMP file to CAB and extracting it, the only files it contains are DAT files.
The only difference I can see between the sub-sites that generate CMP files with no XML files and those that do not is size. For example, there is one site of 114 MB that produces a CMP file with no XML files; small sites do not have this problem. If size were the problem, I would expect the process to generate an error instead of creating a single CMP file that contains only DAT files. However, I do not know exactly what the export/import process in the UI is doing.
This leads to two questions:
1. Does anyone know why some CMP files, when renamed to *.CAB and extracted, would not contain the necessary XML files?
2. If exporting using the UI will not work, can I use PowerShell? I have tried Export-SPWeb, but the Manifest.XML does not contain translatable content. I have not found any parameters for Export-SPWeb that would make the exported CMP match the format produced by the export/import process in the UI.
As a next step, we could try developing custom code using the Publishing Service, but before doing that, I would like to understand why the export/import process in the UI generates a CMP that contains no XML files.
If no one can answer this question, I would appreciate some general help understanding exactly what is happening with the export/import process, that is, the one that runs when you select the export or import option in the Site Manager drop-down. Understanding what it is actually doing will help us troubleshoot why there are no XML files in certain export CMPs and assist with determining an alternate approach.
Thanks in advance
Kim Ryan, SharePoint Consultant kim.ryan@[no spam]pa-tech.com
I wanted to bump this post to see about getting some more responses to your problem. I'm running into the same problem as well. We're running a SharePoint 2010 site and are looking at adding variations now. The two subsites with the most content take a while to generate the .cmp file (one to two minutes of the browser loading bar spinning waiting on the file). Both files are generated with a lot of .dat files but no .xml files. I was thinking like you that it must be a size issue; not sure though. Did you ever happen to find a solution to this problem? -
MediaCore Importer Process Debug Event
I have an XP 64-bit computer running CS4. I keep getting MediaCore Importer Process Debug Event errors on different projects. The error details are sometimes different, but usually they say: Src/importer/ImporterInstance.cpp-641
I have scanned for viruses
I have repaired windows
I have repaired Premiere
I have deleted and reinstalled Premiere
I have updated the drivers on my video card
I have looked around here and haven't found much help. It might be a specific file giving me problems, but I have not tracked it down yet. It might be a codec problem. I am running out of ideas.
My computer was working fine until 2-3 weeks ago. No changes except for maybe a CS4 update. Scanned for viruses; nothing but a few cookies.
Supermicro board
dual 2.6 ghz intel quad xeons
8 gigs of Wintec Ram
Nvidia Quadro (dual monitor)
Alesis audio card
Adaptec raid controller
Seagate hard drives (about 2 tb raid 10)
Panasonic DVCPro deck (FireWire)
Windows XP 64 (just repaired - trying to fix problems)
It is connected to a Cisco Switch for file sharing.
Updated CS4 PP
Most video footage is from gyhd100u but we get footage and video from many sources
I am looking for a post from someone who has had this problem, fixed this problem, or has some insight to what a MediaCore Importer Process Debug Event means.
I have been using Adobe products for over 10 years and I am very computer savvy. Even if you are looking to fix the same problem, send me some info and maybe I can narrow it down and fix it for both of us.
Right now I think it might be my video card. When I repaired Windows I upgraded its drivers, and now I have even more problems. I am rolling them back right now. If that doesn't work I will roll back Premiere -
Hi All,
I have successfully completed the journal import process.
Now, if I want to run this journal import process using FND_SUBMIT.SUBMIT_REQUEST inside a procedure, what parameters do I have to pass?
Request to guide me
Thanks
Sanjay
Sanjay,
You need to insert rows into gl_interface_control first using:
gl_journal_import_pkg.populate_interface_control (user_je_source_name   => p_je_source_name,
                                                  group_id              => p_group_id,
                                                  set_of_books_id       => p_ledger_id,
                                                  interface_run_id      => p_interface_run_id,
                                                  table_name            => p_table_name,
                                                  processed_data_action => p_action);
And then call Journal import using:
fnd_request.submit_request (application => 'SQLGL',             -- application short name
                            program     => 'GLLEZL',            -- program short name
                            description => NULL,                -- program name
                            start_time  => NULL,                -- start date
                            sub_request => FALSE,               -- sub-request
                            argument1   => p_interface_run_id,  -- interface run id
                            argument2   => 1,                   -- set of books id
                            argument3   => 'N',                 -- error to suspense flag
                            argument4   => NULL,                -- from accounting date
                            argument5   => NULL,                -- to accounting date
                            argument6   => l_summary_flag,      -- create summary flag
                            argument7   => 'N',                 -- import desc flex flag
                            argument8   => 'Y');                -- data security mode flag
Thanks
Nagamohan -
Separate Distribution Monitor Export and Import Processes on Multiple Machines
Hi,
Would you kindly let me know whether it is possible (i.e. an officially supported way) to run Distribution Monitor export and import processes on different machines?
As per SAP note 0001595840 "Using DISTMON and MIGMON on source and target systems", it says as below.
> 1. DISTMON expects the export and import to be carried out on the same server
I think it means that the export and import processes for the same tables must be run on the same machine; is that correct? If yes, exporting on machine A and then importing the exported data on another machine B is not officially supported... (although I know it is technically possible.)
Kind regards,
Yutaka
Hi Yutaka,
Points no. 2 and 3 clarify the confusion. However, let me explain briefly:
Distribution Monitor is used basically in case of migration of large SAP systems (database). It provides the feature to increase parallelism of export and import, distributing the process across available systems.
You have to prepare the system for using DistMon. A common directory needs to be created as "commDir", and if you use multiple systems to execute a larger number of export and import processes, that "commDir" should be shared across all those systems. This is what point no. 1 in KBA 1595840 refers to. Distribution Monitor runs both the export and import processes from the machine prepared for DistMon, and DistMon itself controls the other processes, i.e. MigMon; there is no need to start a separate MigMon.
For example: you are performing a migration of an SAP system from OS AIX / DB DB2 to OS HP-UX / DB Oracle, using DistMon for the export, with 4 Windows servers available for parallel export/import. Once you have prepared the system hosting the "commDir" for DistMon, you provide the information about the involved host machines in the "distribution_monitor_cmd.properties" file. When DistMon is executed, it automatically distributes the export and import processes across the systems defined in that file.
Best regards,
SUJIT -
Dear SAP GURUS
Info regarding import process:
Customer request: the customer would like to implement the following import process. Purchasing will order parts from a foreign vendor. After receiving the B/L and I/V, purchasing will do the GR and make payment to the foreign vendor and the local import agent. The material will remain at the import agent's warehouse (because of high volume), which acts as an outside warehouse for the plant.
The plant will run MRP, and confirmed requirements will be sent to the import agent (in the form of a PO), which he has to deliver. The customer wants to do a GR (for delivery-control purposes) but not an IR, as payment has already been made to the foreign vendor.
What I think is that in the above requirement a double GR is happening, which is not recommended. But the customer wants to send the weekly requirements to the import agent automatically, as it does for normal vendors through legacy systems (in the form of a PO).
In the above scenario only possibility I feel is stock transfer scenario from the outside storage location to the plant storage location.
If anyone knows any better solution for import process that will be really hepful.
Regards
Rajiv
Dear Vivek,
I know the same, and I have checked all the configuration, i.e. calculation schema/org. group/pricing procedure/account key for CVD/vendor code/chapter ID per material, etc., but I still cannot work out in the system what is happening.
If you have any information, please let me know. -
Change/new/delete highlighting during import process
Hi everyone,
I intend to use SAP NW MDM to consolidate master data from external sources.
During the import process, I would like to see highlighting of all new, all deleted, and all changed records. The reason for this is the following: before I actually load the imported data into my MDM repository, I want to check the data, just in case some corrupt data was loaded.
Can I do this in the MDM Import Manager?
Best regards, Daniel
Hi,
Whenever an imported record is new or changed, a timestamp could be set (which is most likely set already anyway). After importing the data I could run a routine that shows me all records that have not been updated. I could then delete these records or set them as "inactive".
What do you think about this workaround? Is it feasible in MDM?
Yes, this is feasible in MDM. Whenever a record is changed or updated by a user, the record is updated along with the timestamp.
You can view this in the Data Manager.
If you want to select some records and mark them as inactive, you have to create a field for that.
Hope this helps,
Regards,
Srinivas -
Placing an Import process in the Background
Hello -
I have an import process that is currently running in the foreground on Sun Solaris, and I want to place the PID of the imp process in the background. How do I place it in the background while it is running? Do I have to suspend it, then place it in the background?
Thanks.
Not sure. I assume Unix or Linux is being used.
If , for example you were editing a file in 'vi', how would you put that in the background and get a prompt?
Do a 'man' of 'fg' and 'bg'.
Can you put all the import parameters on the command line (or use a parameter file) and append the '&' symbol (above numeral 7 on the keyboard) to the line? Make sure to log import feedback to a file. -
User exit to run before importing abap transport in TMS
Hi,
I need to run a check before importing an ABAP transport.
I've seen a BADI that can be used after transport, also handy, but I really need to check dependencies before importing as well.
Is there a user exit or badi that can do this?
Thanks
Tom
Hello,
This is possible, in a crude way. SAP fires two events in the transport import process, SAP_IMPORT_START and SAP_IMPORT_STOP.
You can define your actions according to your needs with jobs scheduled to be triggered on these events.
So basically you write your program to run the checks you need before import starts and take actions necessary.
However there are some limitations around this:
Check this link:
http://help.sap.com/saphelp_nw70ehp1/helpdata/en/ef/09745ec0c011d2b437006094b9ea64/frameset.htm
Regards,
Siddhesh -
Can you run the Import Wizard more than once at the same time?
We're going to run the Import Wizard to import content from one BO environment to another. Can we, at the same time, run the Import Wizard to import different content between those two environments (using the same source and destination both times)? Obviously we'd have to run the wizard on different workstations.
Would this cause any issues? If so, what kind of issues would you expect?
Thanks,
Laura
Yes, this is possible. The source and destination CMSs treat the requests from the Import Wizard like any other; increasing the number of active Import Wizard sessions just increases the load on these CMSs. Before running multiple Import Wizard sessions, be sure there is ample processing power (CPU/memory) to handle the increased load. And of course, avoid 'touching' the same content in the two sessions, in order to avoid unpredictable results.
-Mike -
Hi,
I have a retail website, and I am tweaking the final processing page to make it run faster. Right now, our customers wait up to 10 seconds after they hit the confirm button before they see the 'transaction complete' page.
My idea is to take out some of the less important processing and run it after the 'transaction complete' page is displayed to customers.
I am on shared CF7 hosting with cfschedule disabled. Is there an alternative way to tell CF to run a script in the background?
Thanks,
Min
One of the things I like about jQuery (I know this library best, but I imagine others work in a similar fashion) is that when you submit a form via a jQuery function, jQuery uses the action attribute of the form tag to determine where it submits *unless* a submit URL is provided in the options object passed into the ajaxForm function.
I haven't tried it before, but you should be able to set the form action attribute to one URL, which would be used if JS is off, and set another action URL in the options object that gets passed into the ajaxForm function.
Here's a sample of setting up an HTML form to submit via jQuery (the HTML form would use an alternate action page).
$(document).ready(function() {
    var options = {
        target: '#formOutput',         // target element(s) to be updated with server response
        beforeSubmit: showFormRequest, // pre-submit callback
        success: showFormResponse,     // post-submit callback
        resetForm: true,
        // other available options:
        url: 'http://domain.com/blah/blah/blah.cfm', // override for form's 'action' attribute
        type: 'post',    // 'get' or 'post', override for form's 'method' attribute
        dataType: 'json' // 'xml', 'script', or 'json' (expected server response type)
        //clearForm: true // clear all form fields after successful submit
        //resetForm: true // reset the form after successful submit
        // $.ajax options can be used here too, for example:
        //timeout: 3000
    };
    // bind form using 'ajaxForm'
    $('#myform').ajaxForm(options);
});
It might not be worth the effort or fit your requirements, but I wanted to put it out there for you. I have a jQuery-driven CMS I built for a client that I could pull some code samples out of for you if that might help.
Cheers,
Craig