Exporting / Importing a Site
Hi!
I have two servers: the first is for developing sites, and the other is for publishing sites on the Internet. I finished one site on the development server and I want to export the site and mount it on the publishing server. The site has custom portlets in JSP, applets, content areas and other common objects.
How can I do this?
Regards,
FABIAN.
Where is the forum?
I can't find it.
thanks,
Armando
<BLOCKQUOTE><font size="1" face="Verdana, Arial">quote:</font><HR>Originally posted by Sue Vickers ([email protected]):
Fabian,
You may want to search for an answer to this question in the Content Areas forum.
thanks,
Sue<HR></BLOCKQUOTE>
Similar Messages
-
Offline Instantiation of a Materialized View Site Using Export/Import
Has anyone had any success performing offline instantiation of a materialized view site using export/import in Oracle9i?
This is what I wanted to ask in the forum. I want to use Data Pump for the initial instantiation, because I believe that indexes (the actual index data, not just the definitions) are also replicated with Data Pump.
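Since the question centres on using Data Pump for the initial load, the mechanical half can be sketched as a schema-level export parameter file. The schema name, dump file and directory object below are placeholders of mine, not from this thread, and this assumes a release where Data Pump is actually available (10g onward):

```
# expdp_mv_site.par -- placeholder names; run as: expdp system parfile=expdp_mv_site.par
SCHEMAS=APP_OWNER
DIRECTORY=DATA_PUMP_DIR
DUMPFILE=mv_site.dmp
LOGFILE=mv_site_exp.log
```

A matching impdp run at the materialized view site would load the tables, and Data Pump recreates indexes automatically as part of the import (they are rebuilt at the target rather than copied as segments). Whether this counts as a supported offline instantiation of a replication site is exactly the open question here.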
So, is it possible to do the instantiation using Data Pump? -
Within the same site, is it possible to use stsadm export/import to copy a subsite to a new subsite with a different name?
I do not want to use the Save As Template method; I need a command-line approach.

You should create an empty site:
Go to your site collection and click Site Actions --> New Site --> select a custom template, and click Create.
Here you choose meeting2 as the title/URL.
Now execute your import command (Import-SPWeb http://vee:111/meeting2 -Path C:\temp\meeting_exp.cmp -UpdateVersions -Overwrite).
Now wait; when the import is finished, check your site!
That should do it. -
Best choice for exporting / importing EUL
Hi all
I have been tasked with migrating an EUL from an R11 to an R12 environment. The Discoverer version in both environments is 10.1.2, and the OS is Solaris on Oracle databases.
I am unfortunately not experienced with Discoverer, and there seems to be no one available to assist for various reasons, so I have been reading the manual and forum posts and viewing Metalink articles.
I tried exporting the entire EUL via the wizard and then importing it into the new environment, but I was not successful: the system hung for many hours with a white screen, and the log file just ended.
I assumed this delay was caused by a memory problem or a slow network. Someone suggested I export/import the EUL in pieces, and this seemed to be effective, but I got missing-item warnings when trying to open reports. This piecemeal approach also worried me regarding consistency.
So I decided to try the full import on the server, to work around the first problem I experienced. Due to the client's security policies, I am not able to open the source EUL and send it to our dev environment. I was able to get it from their dev 11 system, but I dismissed this because the dev reports were not working and the only reliable EUL is the production one. I managed to get a production .eex file from a client resource, but the upload to my server was extremely slow.
I asked the DBA to assist with a third option: exporting a database dump of EUL_US and importing it into my R12 dev environment. I managed this, but had to export the file as SYS, which worked around a privilege problem when logging in. I now have reports that run, and my user can see reports, but the reports that were not shared to SYSADMIN in the source environment are now prefixed with the version 11 user_id in my Desktop, and the user cannot see her own reports, only the SYSADMIN ones.
I refreshed the BAs using a shell script I made up, which uses the java command with parameters.
After some rereading, I tried selecting all the options in the Validate menu and refreshing in the Discoverer Admin tool.
If I validate and refresh the BA using the Admin tool, I get the hanging screen and a lot of warnings that items are missing (so much for my java command refresh!), and now the report will not open and I see the "substitute missing item" dialogue boxes.
My question to the forum: which would be the best approach to migrate the entire EUL from an R11 instance to an R12 instance in these circumstances?
Many thanks
Regards
Nick

Hi Srini,
The os and db details are as follows:
Source:
eBus 11.5.2
Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit
SunOS 5.10 Generic_142900-11 sun4u sparc SUNW,Sun-Fire-V890
Target:
ebus 12.1.2
Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production DEV12
SunOS 5.10 Generic_142900-11 sun4u sparc SUNW,Sun-Fire-V890
Yes, the DBA initially did an exp for me using EUL_US as the owner, but a strange thing happened with privileges; also, some of the imported tables appeared in the target environment under the APPS schema (21 tables), even though the EUL_US export had been 48 tables.
I also had a problem on the database with "EUL_US has insufficient privileges on tablespace DISCOVERER"-type errors.
I checked the EUL_US database privileges and was unable to resolve this initial privilege error, even though the privileges were granted to EUL_US.
The DBA managed to export as SYSTEM and then import with the full=y flag in the import command, which seems to bring in the privileges.
Then I ran eul5_id.sql, made up a list of the business areas, and wrote a shell script to refresh the business areas as follows:
java -jar eulbuilder.jar -connect sysadmin/oracle1@dev -apps_user -apps_responsibility "System Administrator" -refresh_business_area "ABM Activities" -log refresh.log
This runs successfully and I can log in select business area and grant access to the users. The reports return data.
Then one of the users said she can't see all her reports. If I opened Desktop, I noticed some sitting there prefixed with a hash and her version 11 user id.
So back to the manuals: in the Discoverer Admin help, the instructions are to first go to View > Validate > select all options, then go to the business area and click File > Refresh. This gives me a lot of warnings about items that are missing. I assume this is because the item identifiers brought across in the database dump are the version 11 ones and thus are not found in the new system.
Any suggestions?
Many thanks
Nick -
Regarding Distribution Monitor for export/import
Hi,
We are planning to migrate a 1.2TB database from Oracle 10.2 to MaxDB 7.7, and are currently testing the migration on a test system. First we tried a simple export/import, i.e. without Distribution Monitor: we were able to export the database in 16 hours, but the import had been running for more than 88 hours, so we aborted it. Later we found that we can use Distribution Monitor to spread the export/import load over multiple systems so that the import completes within a reasonable time. We used two application servers for the export/import; the export completed within 14 hours, but again the import ran for more than 80 hours, so we aborted it. We also did table splitting for the big tables, but no luck. Eight parallel processes were running on each server, i.e. one CI and two app servers. We followed the DistributionMonitorUserGuide document from SAP. I observed that on the central system, CPU and memory utilization was above 94%, but on the two added application servers the CPU and memory utilization was very low, i.e. 10%. Please find the system configuration below:
Central Instance - 8CPU (550Mhz) 32GB RAM
App Server1 - 8CPU (550Mhz) 16GB RAM
App Server2 - 8CPU (550Mhz) 16GB RAM
Also, when I used the top Unix command on the app servers, I could see only one R3load process in the run state, while the other 7 R3load processes were sleeping; on the central instance, all 8 R3load processes were in the run state. I think that, since not all 8 R3load processes were running at a time on the app servers, that could be the reason for the very slow import.
Please can someone let me know how to improve the import time? Also, if someone has done a database migration from Oracle 10.2 to MaxDB, it would be helpful to hear how they did it, and any document specific to migrating from Oracle to MaxDB would also be helpful.
Thanks,
Narendra

> Also, when I used the top Unix command on the app servers, I could see only one R3load process in the run state, while the other 7 R3load processes were sleeping; on the central instance, all 8 R3load processes were in the run state. I think that, since not all 8 R3load processes were running at a time on the app servers, that could be the reason for the very slow import.
> Please can someone let me know how to improve the import time.
R3load connects directly to the database and loads the data. The question here is: how is your database configured (in terms of caches and memory)?
> Also, if someone has done a database migration from Oracle 10.2 to MaxDB, it would be helpful to hear how they did it, and any document specific to migrating from Oracle to MaxDB would also be helpful.
There are no such documents available, since the process of migrating to another database is called a "heterogeneous system copy". This process requires a certified migration consultant to be on-site to do/assist the migration. Those consultants are trained specially for certain databases and know tips and tricks for improving the migration time.
See
http://service.sap.com/osdbmigration
--> FAQ
For MaxDB there's a special service available, see
Note 715701 - Migration to SAP DB/MaxDB
Markus -
We have a SharePoint 2010 publishing website that uses variations to deliver content in multiple languages. We are using a third-party translation company to translate publishing pages. The pages are exported using the export/import UI process described here: "http://blogs.technet.com/b/stefan_gossner/archive/2011/12/02/sharepoint-variations-the-complete-guide-part-16-translation-support.aspx".
Certain sub-sites are extremely content-intensive. They may contain many items in the Pages library as well as lists and other sub-sites.
For some sub-sites (not all), the exported CMP file contains no XML files. There should be a Manifest.XML, Requirements.XML, ExportSettings.XML, etc., but there are none. After renaming the CMP file
to CAB and extracting it, the only files it contains are DAT files.
The only difference I can see between the sub-sites that generate CMP files with no XML files and those that do not is size. For example, there is one site of 114 MB that produces a CMP file with no XML files; small sites do not have this problem. If size is the problem, then I would expect the process to generate an error instead of creating a single CMP file that contains only DAT files. However, I do not know exactly what the export/import process in the UI is doing.
This leads to two questions:
1. Does anyone know why some CMP files, when renamed to *.CAB and extracted, would not contain the necessary XML files?
2. If exporting using the UI will not work, can I use PowerShell? I have tried Export-SPWeb, but the Manifest.XML does not contain translatable content. I have not found any parameters for Export-SPWeb that cause the exported CMP to be in the same format as the one produced by the export/import process in the UI.
As a next step, we could try developing custom code using the Publishing Service, but before doing this, I would like to understand why the Export/Import process in the UI generates a CMP that
contains no XML files.
If no one can answer this question, I would appreciate just some general help on understanding exactly what is happening with the Export/Import Process -- that is, the one that runs when you select
the export or import option in the Site Manager drop down. Understanding what it is actually doing will help us troubleshoot why there are no XML files in certain export CMPs and assist with determining an alternate approach.
Thanks in advance
Kim Ryan, SharePoint Consultant kim.ryan@[no spam]pa-tech.com

I wanted to bump this post to see about getting some more responses to your problem. I'm running into the same problem as well. We're running a SharePoint 2010 site and are looking at adding variations now. The two subsites with the most content take a while to generate the .cmp file (one to two minutes of the browser loading bar spinning, waiting on the file). Both files are generated with a lot of .dat files but no .xml files. I was thinking, like you, that it must be a size issue; not sure, though. Did you ever happen to find a solution to this problem? -
XML Export/Import Query & Variables
Hello,
We are currently bringing up a new environment/landscape. With that, I'm trying to copy some of the query elements off our current production server and move them back to the new development server. After much wasted time trying to duplicate the query manually, I've decided to do it via XML Export/Import.
I have successfully created an XML file from our production system (sitting on my local machine). I then go to the import area on the development system and bring up the XML file I created. I then process the file on the development server. The import wizard tells me that it saves the objects successfully, but I'm unable to find any of the objects on the system. So, I have two questions.
1) Is this the proper way to do this, or is there a better way (assuming the systems are not on the transport path)?
2) With the logs saying successful, where do those objects go? Is there a way I can find them? Do they need to be activated?
Thanks in advance,
JW
Edited by: J W on Sep 19, 2008 2:02 AM

bump
-
Is time to export-import location dependent
I have a doubt whether the export/import process depends on the location from which it is performed. I will try to explain the scenario: there is an export/import to be done of a server in Africa, as migration to a new server is required. My question is whether it makes a difference to the time the export/import takes if the SecureCRT session is logged on from here in Africa, or if the process is done from a far-off location, say India, as some of our DBA team is in India.
So, the doubt is: does the export/import process depend on the location from which it is performed?
I hope my question is clear.
Please help in resolving the doubt.
regards

You need to clarify, from an Oracle perspective, what the client machine is in this scenario (i.e. the machine that is connecting to the Oracle database). What machine is the export utility running on, and what machine is the export utility writing to?
If the export utility is running on the same machine and writing to a dump file on the same machine regardless of where the DBA is sitting, then where the DBA is sitting is irrelevant. If the export utility runs on a different machine or writes to a dump file on a different machine depending on where the DBA is sitting, then where the DBA is sitting is relevant.
Justin -
Export/import utility for portals: opeasst ; where to launch from?
hi all
Does the opeasst.csh script have to be run from the infrastructure home or the mid-tier home? I think that for exports of pages and portlets from 904, Chapter 10 on export/import requires us to launch the export from the mid-tier.
However, if I set up the mid-tier as my home and intend to use opeasst.csh, I do not see this script in the directory where it is supposed to reside:
ORACLE_HOME\portal\admin\plsql\wwu
Instead, it is in the ORACLE infrastructure home:
ORACLE_HOME\portal\admin\plsql\wwu
Please help; it is urgent.
I am using OracleAS 10g Portal version 9.0.4.

I've been doing some runs and had typical DB exports at about 12 GB per hour (compressed to a ~7 GB file). The typical DB I/O bottleneck was loosened by pointing the export at a different file system/disk from the one where the DB data files reside (CPU ~80% of a single CPU).
Now, the import is the bigger item. -
Progamming export / import of XMP
Hi folks,
I'm still very new to this (just bought a book on Lua), but I want to create something that exports the XMP files (so that I can edit them on the road) and imports them back into Lightroom. I've been reading the API reference and SDK manual lately and had a feeling that this is not possible, but since I've seen some workarounds for achieving certain goals in this forum, I thought it still worth asking.
So, assuming that this is not possible, here are a few of my ideas:
Create some keyboard macros to automate the XMP export/import process. It'd be nice and sweet if it were some simple application I was dealing with, but with a complex app like Lr there are too many unpredictables, and since it's all valuable data, maybe this is not a good idea.
Create a plug-in that does nothing but generate XMP files using the post-processing part, and then import the edited XMP manually. In this case, though, I wonder if it's even possible to create a plug-in that will not export photos but only performs post-processing.
Would anyone please shed some light?
Thanks.
Nick.

Hi Rob,
To be honest, I just Googled your name and found your web site, which contains plug-ins that do things similar to what I wanted to do. This is nice to know, because I just wanted to know if there's a way to work around this -- I didn't want to waste my time writing code that wouldn't eventually give me what I wanted, thus the intention to ask the question in the first place. ;-)
I have some development background dealing with XML version control and conflict management, so merging online vs. offline content is okay for me. Since all I want to do at the moment is edit keywords and titles (or any other description fields), maybe I'll just create some CRCs from the online files and write any changed values in the offline files back to the online files.
How I plan to edit the XMP files is to create an XSLT stylesheet that converts XMP files into a spreadsheet, so I can edit on the road. If it's too much work, I might just create a PHP script that sits on my home PC hosting the XMP web-edit interface, so I can edit them from any device with a web browser -- with this method I might not even have to deal with XMP versioning, since I'll probably have the PHP read the XMP and write changes in real time.
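For the edit-on-the-road half, here is a rough sketch of the extraction step in Python rather than XSLT. The helper names and field choices are mine, and real XMP sidecars vary, so this assumes plain Dublin Core title/keyword fields:

```python
import csv
import xml.etree.ElementTree as ET

# Namespace prefixes used inside an XMP sidecar (Dublin Core fields).
NS = {
    "rdf": "http://www.w3.org/1999/02/22-rdf-syntax-ns#",
    "dc": "http://purl.org/dc/elements/1.1/",
}

def read_xmp_fields(xmp_text):
    """Pull the title and keywords out of one XMP document string."""
    root = ET.fromstring(xmp_text)
    desc = root.find(".//rdf:Description", NS)
    title = desc.find("dc:title/rdf:Alt/rdf:li", NS)
    keywords = [li.text for li in desc.findall("dc:subject/rdf:Bag/rdf:li", NS)]
    return {
        "title": title.text if title is not None else "",
        "keywords": ";".join(keywords),
    }

def rows_to_csv(rows, path):
    """Write one row per image, ready to edit in any spreadsheet app."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["title", "keywords"])
        writer.writeheader()
        writer.writerows(rows)
```

A similar helper reading the edited CSV back and rewriting the rdf:li elements would close the round trip.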
The mortifying part is -- as you pointed out already -- that using a keyboard macro isn't really elegant, but I guess this is the only way to automate this kind of stuff...
Thanks.
Nick. -
How to export all the site files to the new computer
I'll be buying a new computer soon and need to export all the site files (usernames and passwords) to it. I know how to export individual site information, but the passwords are encrypted. Also, is there a way to save all sites at once and then just transfer that to the new machine?
Choose Site > Manage Sites and then use the Export feature in the Site Manager. You'll have an option to export all settings (including username and password). DW stores them in .ste files, which you can later use the Site Manager to import.
By the way, the Import/Export features work with multiple files: just do a multi-select before clicking the Export button, and select multiple .ste files when importing.
Best - Joe
Joseph Lowery
Vice President of Marketing, WebAssist
Author of Dreamweaver CS3 Bible
-
How to Export/Import a batch job
Hi there,
I am wondering if there is a transaction that can export/import background jobs, because one of our production background jobs somehow got deleted. We have the same job sitting in the QA system, so we want to export that background job and import it into the production system, since it has around 70+ steps.
Thanks
Kumar

I'm not sure, but if these jobs can be put in a request and then transported...
You can try. -
SYSTEM Copy with Data ( SAP Export/Import way)
Hello ,
There is a requirement at my client site to build an SAP system copy from the production system without copying data, but it should have programs, structures, repository, tables & views etc.
We have thought of building the SAP system the export/import way and then deleting the client from the copied system, after that running a remote client copy from the source to the target system with the SAP_CUST profile.
But I have heard that with the SAP export/import method, the SAP data copy can be skipped and only the structure copied. If there is any such way, then please help by letting me know.
Thanks
Thanks
Deepak Gosain

Hi Deepak,
Kindly refer to the SCN link on the difference between the client copy export/import and remote copy methods:
Difference between remote client copy and client Import/Export - Remote client copy steps
BR
SS -
Recovering the rules app from one export/import (HOWTO?)
Hello Aravind:
I'm having an issue with my rules app.
Our customer has a requirement: they want to export the production environment to the test environment, to have the data updated and be able to reproduce recent issues to create a report.
The problem is that when we import the schema into the test environment, we are not able to use the rules app again; the results views are no longer valid, because the RLM$SESSRSLTTTAB tables used to create these views do not exist. So we have to manually delete all objects related to the rules app and re-run the scripts to create everything again.
The question is whether there is any procedure to re-create these internal objects, or a way to generate the export/import so as to avoid this.
Thanks in advance, Aravind.
Alex.

Alex,
Rules Manager applications can be exported and imported to a different database, with the restriction that the destination schema is the same as the schema from which the objects were exported. When a rules application is exported, the internal objects are skipped. At the time of import, all the internal objects are recreated. Note that these internal objects may now have different names (object numbers) on the import site. Please let me know if this is not the behavior you are seeing when you use export/import.
http://download.oracle.com/docs/cd/B28359_01/appdev.111/b31088/exprn_brm_intro.htm#insertedID7
Regards,
-Aravind. -
Recommended workflow-way to manage exports for stock sites
Dear All,
Is there an efficient workflow that someone has come up with and could recommend for managing exports for stock-site photography easily?
The main problem is that if you upload 10 images, for instance, to 3 different sites, you will most probably face different accepted and different rejected images on each site.
Then you have to modify adjustments and re-upload the rejected images for a second try, or even a third one.
And this is the exact point where the problem occurs.
There must be an efficient way to have, in different folders (or smart albums or whatever), the same images with different statuses (rejected or accepted, for instance, or red or green or whatever) and, at the same time, those images with different settings (those you had to make in order to get the images accepted). Keep also in mind that the images may have different keywords as well (in case an image gets rejected due to keywording).
It's important to mention that, in the end, the folders must match the real status of the images at the stock site they were submitted to, regarding state (accepted or not) and, of course, settings. (It's important in the end to know that on site "A" you have exactly those images, on site "B" the others, and so on.)
It's more than certain that those 3 different folders (or smart albums or whatever) will eventually contain the same images, but most of them will have different statuses, settings and keywords.
The only way I found to manage it is by creating export folders using virtual copies in the Publish Services section of LR. Note that for Publish Services I used my hard drive.
But the drawbacks of this approach are:
1. You double or triple the files, depending on the number of stock sites (at least they are virtual ones).
2. When an image is rejected by a stock site and you mark the corresponding image as "marked for republish", even though in LR it is clear which files were rejected and should be edited and republished, you can't track them in the export folder; you must export to a new folder every time, creating double or triple JPG files. I think it would be better if LR could automatically sync the folder, somehow showing the images marked for republish.
Sorry for the long post, but how could it be smaller?
Thanks in advance.
Vasilis

The point of Publish is not that the image disappears afterwards, but that it remains there afterwards, moving between the categories into which the Publish window is divided. IIRC you can collapse the parts you are not interested in... (?) I don't have LR in front of me just now. Pictures then move categories again the moment you make further changes. It's an ongoing relationship between the catalog and the publish location, and you use Publish when this is what you want.
It is not very easy, AFAICT, to terminate this relationship using the built-in Publish function without also marking the externally published file for removal. However, some 3rd-party LR plugins offering a Publish facility can do this for you cleanly.
Personally, I have concluded that the standard one-off (but repeatable, via named preset) Export inherently meets my needs better than Publish does, for this very reason, and I have therefore ducked fully dealing with this Publish life-cycle issue. So often, a change is made to an image which is not really substantive and does not justify re-publishing.
But I definitely suggest looking into plugins if you like the idea of Publish but find it restrictive currently, including (in no particular order and not comprehensive):
http://www.photographers-toolbox.com/
http://regex.info/blog/lightroom-goodies
http://www.robcole.com/Rob/ProductsAndServices/PublishServiceAssistantLrPlugin/