DSCL Error on export and import of MCX parental controls
Hi, I have been using the -mcxexport and -mcximport commands to copy local user parental controls in my labs, but I have hit a problem.
I need to use the commands to secure 250+ Macs over ARD. The commands have worked for me in the past, but now I get Error: 14987.
I have seen this error before but got around it by adding the .plist extension and making sure it had root access.
This has not worked. I am not a DSCL guru or a UNIX boffin and am really stuck as to where this problem is.
I am currently in a lab of 20 where I get this error. I have another lab next door with a very similar build, and that works fine.
sudo dscl . -mcxexport /Users/student /Users/admin/Documents/parental_controls.plist
Error:14987
I have used sudo and even run logged in as root. Executed locally and over ARD. Pretty stumped, and quite desperate!
The clients are 10.6.8
Any help would be really great!
Cheers
Carl
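Error 14987 is opaque, so one low-tech step is to rule out filesystem causes before suspecting Directory Services itself. Below is a minimal pre-flight sketch in pure Python (the checks are assumptions about common causes - missing directory, permissions, extension - not a documented decoding of 14987):

```python
import os

def preflight_export_target(plist_path):
    """Return a list of likely problems with the mcxexport destination path."""
    problems = []
    parent = os.path.dirname(plist_path) or "."
    if not plist_path.endswith(".plist"):
        problems.append("destination does not end in .plist")
    if not os.path.isdir(parent):
        problems.append("parent directory %s does not exist" % parent)
    elif not os.access(parent, os.W_OK):
        problems.append("parent directory %s is not writable" % parent)
    return problems

# Example: flag problems before dscl ever runs (path from the post above).
issues = preflight_export_target("/Users/admin/Documents/parental_controls.plist")
```

If this reports nothing, the difference between the working lab and the failing one is more likely in the local Directory Services data itself.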
Thanks - but that is effectively what I did - I used the repository import from my exported ZIP file, and it all imported OK - but the error on login to Topology Manager afterwards is that the snp user does not exist.
Although I initially used the 'Master Repository Creation', I then abandoned this, dropped and recreated my Oracle user 'snpm' that owns the master repository, and used the import repository approach.
I am trying to sign in as the standard SUPERVISOR user (which existed OK in the Hypersonic master repository that was exported to the ZIP file) and also trying other users that I know existed in the Hypersonic repository that I exported - same problem. I can see in Oracle SQL*Plus that the SUPERVISOR user exists in the SNP_USERS table.
If I instead carry out a standard 'Create Master Repository' and do NOT try to import anything, then it works OK and I can sign in to the Topology Manager as SUPERVISOR.
Similar Messages
-
Error when exporting and importing TS
Hi, I am running SCCM 2012 SP1 and am trying to export a Task Sequence including the Windows 7 WIM, the boot WIM, some applications and drivers. When I exported the TS I selected to export dependencies and content.
I have set up a dev SCCM environment (same spec as production) and am trying to import the .zip that was created when I completed the export. When doing the import I am getting the below error on both the Windows 7 WIM and the boot WIM:
Error Imported Operating System Image (1):
Create new Windows 7 Enterprise x64
Generic failure
instance of SMS_ExtendedStatus
Description = "Failed to get the image property from the source WIM file due to error 80070003";
ErrorCode = 2147942403;
File = "e:\\nts_sccm_release\\sms\\siteserver\\sdk_provider\\smsprov\\sspimagepackage.cpp";
Line = 483;
Operation = "Putinstance";
ParameterInfo = "";
ProviderName = "ExtnProv";
StatusCode = 2147749889;
The applications and drivers have been imported into the dev SCCM server; however, I cannot find the actual content, so I am not sure this has worked properly. I am wondering if I would be better off using the Migration Wizard to take the TS and content across. I would appreciate any advice on either a) how to fix the issues with importing the TS, or b) whether I should just be using the Migration Wizard.
Thanks
The applications and drivers have been imported into the Dev SCCM server however I cannot find the actual content so I am not sure this has worked properly.
When you exported, did you choose "Export all content for selected task sequence and dependencies"?
80070003 = The system cannot find the path specified.
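The decimal ErrorCode in the SMS_ExtendedStatus block is the same value: 2147942403 is the HRESULT 0x80070003. A small sketch for decoding such codes (the facility/error names in the comments are standard Windows values):

```python
def decode_hresult(code):
    """Split a 32-bit HRESULT into (failure bit, facility, low 16-bit code)."""
    code &= 0xFFFFFFFF
    return (code >> 31) & 1, (code >> 16) & 0x1FFF, code & 0xFFFF

# 2147942403 = 0x80070003: facility 7 (FACILITY_WIN32),
# Win32 error 3 = ERROR_PATH_NOT_FOUND ("cannot find the path specified").
print(decode_hresult(2147942403))   # (1, 7, 3)
# StatusCode 2147749889 = 0x80041001, the WMI "Generic failure" (WBEM_E_FAILED).
print(decode_hresult(2147749889))   # (1, 4, 4097)
```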
See here for exporting and importing task sequences
http://blogs.technet.com/b/inside_osd/archive/2012/01/19/configuration-manager-2012-task-sequence-export-and-import.aspx
Gerry Hampson | Blog:
www.gerryhampsoncm.blogspot.ie | LinkedIn:
Gerry Hampson | Twitter:
@gerryhampson -
Error in Exporting and Importing Interface.
Hi guys, warm greetings!
My problem is:
1] I added a constraint to my datastore (table) at the database level and reverse-engineered it, but the metadata is not reflected in the datastore.
2] I then thought of dropping the entire datastore and re-reversing it; for this I had to drop the interface that was using the same datastore, because dropping the datastore threw a "Referenced Error".
3] To do this I took a backup of my interface by exporting it to some path, and then deleted my original datastore.
4] I reverse-engineered it all over again, and now the constraint was visible.
5] I then imported the interface which I had exported in the above steps.
Now I am getting an error like "No Dataserver is associated with the Column" for every column in my target.
Waiting for your wise comments/advice.
Thank you,
Diwakar.
Hi Diwakar,
I had this issue earlier.
The exported interface XML file contains the details of the datastore which you deleted before reversing the new modified datastore.
That XML file has the ID of the old datastore; after importing the interface, it will search for the respective datastore, but it won't be available, because you deleted it.
I rebuilt the interface to resolve this issue.
Another option is to modify the exported XML file to refer to the new datastore details, such as the ID.
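That second option can be sketched as a small XML rewrite. The fragment below is hypothetical - the real ODI export format and its element/attribute names differ - but the idea is the same: find every reference to the old datastore ID and point it at the new one:

```python
import xml.etree.ElementTree as ET

# Hypothetical stand-in for an exported interface; not the real ODI schema.
xml_text = '<Interface><Source DataStoreId="101"/><Target DataStoreId="101"/></Interface>'

def retarget_datastore(xml_text, old_id, new_id):
    """Rewrite every DataStoreId attribute matching old_id to new_id."""
    root = ET.fromstring(xml_text)
    for el in root.iter():
        if el.get("DataStoreId") == old_id:
            el.set("DataStoreId", new_id)
    return ET.tostring(root, encoding="unicode")

fixed = retarget_datastore(xml_text, "101", "202")
```

In practice, open the real export, find how the deleted datastore's ID appears, and substitute the ID of the re-reversed datastore before importing.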
Experts' comments are welcome.
Thanks,
Madha. -
Exporting and Importing Destination Controls Error
When I export a table within Destination Controls and import the table into another IronPort, again within Destination Controls, I receive the error "Wrong format of the destination config file: ip_sort_pref is required for the global settings."
We are updating AsyncOS on each of our de-clustered IronPorts, so when one is not in use we need the other to handle all the traffic, including destination controls.
The export output displays:
[ABCDE.com]
max_host_concurrency = 500
limit_apply = system
limit_type = host
max_messages_per_connection = 50 ......
How can I import this without losing or altering the data within?
Solution
We upgraded one of our SMTP appliances to AsyncOS 8.0.1, which creates an additional line within the default section of the upgraded IronPort's exported destination control table, named... wait for it.... ip_sort_pref. Just add it.
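Since the newer export just adds one key, a pre-processing step can patch an older export before importing it on the upgraded box. A minimal sketch - the section layout and the default value "0" are assumptions; copy the real ip_sort_pref line from an export taken on the 8.0.1 appliance:

```python
def ensure_ip_sort_pref(lines, value="0"):
    """Insert an ip_sort_pref line into the global settings if it is missing.

    Assumes global settings sit before the first [domain] section of the
    exported destination config file.
    """
    if any(l.strip().startswith("ip_sort_pref") for l in lines):
        return lines          # already present: leave the file untouched
    out, inserted = [], False
    for line in lines:
        if not inserted and line.strip().startswith("["):
            out.append("ip_sort_pref = " + value)
            inserted = True
        out.append(line)
    if not inserted:
        out.append("ip_sort_pref = " + value)
    return out

# Patch the sample export from the post above.
patched = ensure_ip_sort_pref(["[ABCDE.com]", "max_host_concurrency = 500"])
```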
Trapping in shell export and import errors
I'm exporting a table with a Unix shell script and re-importing the data into another database.
Both databases are Oracle 11g.
How can I handle errors during the export or import?
Thanks
Read about LOGFILE and
[TABLE_EXISTS_ACTION|http://download.oracle.com/docs/cd/B28359_01/server.111/b28319/dp_import.htm#sthref575] -
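Besides checking the shell exit status (`$?` after exp/imp) and the TABLE_EXISTS_ACTION parameter above, the usual approach is to name a log file (log=exp.log) and scan it afterwards. A minimal sketch of the scanning side, assuming the standard EXP-/IMP-/ORA- message prefixes:

```python
def scan_oracle_log(log_text):
    """Collect lines that look like export/import errors (EXP-, IMP-, ORA-)."""
    prefixes = ("EXP-", "IMP-", "ORA-")
    return [line for line in log_text.splitlines()
            if any(p in line for p in prefixes)]

# Hypothetical log excerpt for illustration.
sample = """Export: Release 11.2.0.1.0
. . exporting table  MYTABLE  1000 rows exported
EXP-00008: ORACLE error 1555 encountered
ORA-01555: snapshot too old
Export terminated successfully with warnings."""
errors = scan_oracle_log(sample)
```

A wrapper script can abort the subsequent import whenever this list is non-empty.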
Model Export and Import - Error 40144
I'm trying to transport a model from one system to another and I get the following error when I try the Model export and import functionality in IBP 4.0
"errorCode":"40144","errorMsg":"Repository: delivery unit not found; deliveryUnit: 'SOP_MODEL_EXPORT_DU', vendor:sap.com'"
Steps:
1. Model Export -> Create Export
2. Once the Export is created, I clicked 'Download Deployment Unit' and that's when I get the above error.
Am I doing anything wrong? If so, what is the procedure to execute the model export and import between systems?
It turns out a transportation path has to be set up (one time) from the backend (by SAP) to enable this feature.
-
SQL Developer 2.1: Problem exporting and importing unit tests
Hi,
I have created several unit tests on functions that are within packages. I wanted to export these from one unit test repository into another repository on a different database. The export and import work fine, but when running the tests on the imported version, there are lots of ORA-06550 errors. When debugging this, the function name is missing in the call, i.e. it is attempting <SCHEMA>.<PACKAGE> (parameters) instead of <SCHEMA>.<PACKAGE>.<FUNCTION> (parameters).
Looking in the unit test repository itself, it appears that the OBJECT_CALL column in the UT_TEST table is null - if I populate this with the name of the function, then everything works fine. Therefore, this seems to be a bug with export and import, and it is not including this in the XML. The same problem happens whether I export a single unit test or a suite of tests. Can you please confirm whether this is a bug or whether I am doing something wrong?
Thanks,
Pierre.
Hi Pierre,
Thanks for pointing this out. Unfortunately, it is a bug on our side and you have found the (ugly) "work-around".
Bug 9236694 - 2.1: OTN: UT_TEST.OBJECT_CALL COLUMN NOT EXPORTED/IMPORTED
Brian Jeffries
SQL Developer Team -
DHCP Server does not work after Exporting and Importing Using Netsh Command
Hello Friends :
I had two DHCP servers on Windows Server 2003. I upgraded one of them to Windows Server 2008 32-bit, and I also installed a Windows Server 2008 R2 machine as an additional Domain Controller. The final scenario was like this:
srv-1 : windows server 2003 + DHCP = working with no problem
srv-2 : windows server 2003 + DC + DHCP = Worked without problem
srv-3 : windows server 2008 R2 + DC = worked without any problems
I exported the DHCP server configuration on srv-2 using netsh dhcp server export and imported it to srv-3 using the
netsh dhcp server import command. The command completed successfully and I can see all of the scopes without any problems or errors. I have authorized the new server and all scopes are activated without any problem, so I
disabled the srv-2 DHCP service and unauthorized it in Active Directory. The problem is that the new server seems not to lease any addresses to clients!
1- I have authorized it
2- I used Rogue Checker tool in client computers they see authorized server without any problems
3- The same tool in workgroup only shows srv-1 as the DHCP server and does not see other DHCP servers
4- Bindings are OK and DHCP servers only have one NIC installed on them
What can I do to make sure my srv-3 DHCP server will work on the network?
thanks ...
MIMO
Are the clients on another network, such that you need to configure a DHCP relay agent?
If you load up perfmon on the DHCP server, remove all counters and then add the DHCP counters, do you see any DHCP requests when you reboot your DHCP clients? This will determine whether your server actually receives any DHCP requests.
Have you checked Event Viewer for any warnings or errors?
And the classic one: have you restarted the DHCP Server service (or rebooted)?
Regards Per-Torben Sørensen http://pertorben.wordpress.com/ -
Export and import of data not table and data ????
Hi brothers and sisters,
Please, I have a question about export and import of data in Oracle Forms.
I have created two buttons. The export button's trigger is like this:
declare
alrt number;
v_directory varchar2(200) := 'c:\backup'; -- that is, if the C drive is not the drive that Windows is installed on
path varchar2(100):='back_up'
||to_char(sysdate,'dd_mm_yyyy-hh24_mi_ss');
v_exp varchar2(200) := 'exp hamada/hamada2013@orcl file = '
||v_directory
||'\'
||path
||'.dmp';
begin
host(v_exp);
alrt:=show_alert('MSG');
end;
and the import button is like this:
declare
alrt number ;
v_ixp varchar2(200) := 'imp userid=hamada/hamada2013@orcl file =c:\backup2\back.dmp full=yes';
begin
host(v_ixp);
alrt:=show_alert('MSG');
ref_list;
end;
I have just one table, "phone".
This code is correct: it exports not only the data but also the creation of the table. For example, I do the export and everything is good, and I find the .dmp in the backup folder. But when I delete all data from my app and try to import this .dmp, it shows me an error telling me that the table phone is already created.
So please help: I want to export just the data of phone, not the table creation and the data. Or how can I import just the data from this .dmp?
Pl post OS and database versions.
You will likely need to use the IGNORE flag - http://docs.oracle.com/cd/E11882_01/server.112/e22490/original_import.htm#sthref2201
Imp utility (by default) will try and create the table first. Since the table already exists, imp reports an error. Use the IGNORE flag in your imp command to not report such errors.
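To make the fix concrete, here is a sketch of the corrected command string the import button's trigger would pass to host(), with ignore=y added (same example credentials and path as the post; note that re-importing into a table that still contains rows will duplicate them, so clear the table first if needed):

```python
def build_imp_command(userid, password, tns, dmp_file):
    """Mirror the trigger's command string, adding ignore=y so imp loads
    rows into the existing table instead of failing on CREATE TABLE."""
    return ("imp {u}/{p}@{t} file={f} full=yes ignore=y"
            .format(u=userid, p=password, t=tns, f=dmp_file))

# Example values taken from the post above.
cmd = build_imp_command("hamada", "hamada2013", "orcl", r"c:\backup2\back.dmp")
```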
HTH
Srini -
Export and Import of mappings/process flows etc
Hi,
We have a single repository with multiple projects for DEV/UAT and PROD of the same logical project. This is a nightmare for controlling releases to PROD, and in fact I suspect we have a corrupt repository as a result. I plan to split the repository into 3 separate databases so that we have a design repository for each of DEV, UAT and PROD. To control code migrations between these, I plan to use metadata export and subsequent import into UAT and then PROD once tested. I have used this successfully before on a project, but am worried about inherent bugs with metadata export/import (been bitten before with Oracle Portal). So can anyone advise what pitfalls there may be with this approach, and in particular whether anyone has experienced loss of metadata between export and import? We have a complex warehouse with hundreds of mappings, process flows, sqlldr flat-file loads etc. I have experienced process flow imports that seem to lose their links to the mappings they encapsulate.
Thanks for any comments,
Brandon
This should do the trick for you: it looks for "PARALLEL", therefore it only removes the APPEND PARALLEL hint and leaves other hints as is.
#set current location
set path "C:/TMP"
# Project parameters
set root "/MY_PROJECT"
set one_Module "MY_MODULE"
set object "MAPPINGS"
# OMBPLUS and tcl related parameters
set action "remove_parallel"
set datetime [clock format [clock seconds] -format %Y%m%d_%H%M%S]
set timestamp [clock format [clock seconds] -format %Y%m%d-%H:%M:%S]
set ext ".log"
set sep "_"
set ombplus "OMBplus"
set omblogname $path/$one_Module$sep$object$sep$datetime$sep$ombplus$ext
set OMBLOG $omblogname
set logname $path/$one_Module$sep$object$sep$datetime$ext
set log_file [open $logname w]
set word "PARALLEL"
set i 0

# Helper: write a timestamped message to the console and the log file
proc log_both {log_file msg} {
    set ts [clock format [clock seconds] -format "%Y%m%d %H:%M:%S"]
    puts "$ts --> $msg"
    puts $log_file "$ts --> $msg"
}

#Connect to OWB Repository
OMBCONNECT .... your connect string
#Ignores errors that occur in any command that is part of a script and moves to the next command in the script.
set OMBCONTINUE_ON_ERROR ON
OMBCC "'$root/$one_Module'"

#Searching Mappings for Parallel in source operators
log_both $log_file "Searching for Loading/Extraction Operators set at Parallel"
log_both $log_file "Module: $one_Module, Object_type: Mapping"

# The original repeated the same loading/extraction checks for TABLE,
# DIMENSION, CUBE and VIEW operators; the repetition is folded into loops.
foreach mapName [OMBLIST MAPPINGS] {
    foreach opKind {TABLE DIMENSION CUBE VIEW} {
        foreach opName [OMBRETRIEVE MAPPING '$mapName' GET $opKind OPERATORS] {
            foreach hint {LOADING_HINT EXTRACTION_HINT} {
                foreach prop [OMBRETRIEVE MAPPING '$mapName' OPERATOR '$opName' GET PROPERTIES ($hint)] {
                    if { [regexp $word $prop] == 1 } {
                        incr i
                        log_both $log_file "Mapping: $mapName, Operator: $opName, $hint: $prop"
                        OMBALTER MAPPING '$mapName' MODIFY OPERATOR '$opName' SET PROPERTIES ($hint) VALUES ('')
                        OMBCOMMIT
                    }
                }
            }
        }
    }
}

log_both $log_file "Module: $one_Module, Object_type: Mapping"
if { $i == 0 } {
    log_both $log_file "Not found any Loading/Extraction Operators set at Parallel"
} else {
    log_both $log_file "Fixed $i Loading/Extraction Operators set at Parallel"
}
close $log_file
Enjoy!
Michel -
Export and Import of Access DB tables in Apex
Hi,
i need help.
I want to export an MS Access DB table into an Oracle 10gR2 database,
but every time I get an error because my ODBC driver isn't compatible with my TNS service name.
What could I do?
I only want to export a table, not the whole database.
If I export and import a table using an XE version, there are no errors and I can see the table in APEX.
...thanks for the help :)
Hi Timo,
If you open your table in Access, and click the "select all" (the empty grey box at the left of the header row), you can copy the entire table and paste it into Apex as though you were copying from Excel - the structure of the copied data is the same. The only thing is that you will be limited to 32,000 characters.
Obviously, though, getting hold of, and setting up, the correct ODBC drivers is the more proper solution but the above may solve your immediate problem.
Regards
Andy -
View and Edit properties pages are not woriking after export and import in sharepoint2013
Hi,
After performing an export and import in SharePoint 2013 for a subsite, metadata is missing in a document library. A 404 error triggers on clicking both View Properties of an item and /Forms/DisplayDocument.aspx.
Please help me in this.
Regards,
Gurudatta
While uploading, after browsing to the document and clicking the OK button, the error below is given.
Nothing is generated in the event logs. Below are the SharePoint logs.
07/16/2014 17:32:08.44 w3wp.exe (0x5234) 0x499C SharePoint Foundation
Logging Correlation Data xmnv Medium Name=Request (GET:http://siteurl/frmFile4/Forms/CreateDocument.aspx?Mode=Upload&CheckInComment=&ID=963&RootFolder=/QAPortal/PackingMaterial/frmFile4&Source=http://siteurl/frmFile4/Forms/AllItems.aspx?InitialTabId=Ribbon%252EDocument&VisibilityContext=WSSTabPersistence&RootFolder=%252FQAPortal%252FPackingMaterial%252FfrmFile4&IsDlg=1) 9e13a59c-dffb-b08d-acd9-c283e837f4a5
07/16/2014 17:32:08.44 w3wp.exe (0x5234) 0x499C SharePoint Foundation
Request Management adc7u Medium Mapping URI from 'http://siteurl/frmFile4/Forms/CreateDocument.aspx?Mode=Upload&CheckInComment=&ID=963&RootFolder=%2FQAPortal%2FPackingMaterial%2FfrmFile4&Source=http%3A%2F%2Fuslsp13%3A1111%2FQAPortal%2FPackingMaterial%2FfrmFile4%2FForms%2FAllItems%2Easpx%3FInitialTabId%3DRibbon%252EDocument%26VisibilityContext%3DWSSTabPersistence%26RootFolder%3D%252FQAPortal%252FPackingMaterial%252FfrmFile4&IsDlg=1'
to 'http://siteurl/frmFile4/Forms/CreateDocument.aspx?Mode=Upload&CheckInComment=&ID=963&RootFolder=%2FQAPortal%2FPackingMaterial%2FfrmFile4&Source=http%3A%2F%2Fuslsp13%3A1111%2FQAPortal%2FPackingMaterial%2FfrmFile4%2FForms%2FAllItems%2Easpx%3FInitialTabId%3DRibbon%252EDocument%26VisibilityContext%3DWSSTabPersistence%26RootFolder%3D%2... 9e13a59c-dffb-b08d-acd9-c283e837f4a5
07/16/2014 17:32:08.44* w3wp.exe (0x5234) 0x499C SharePoint Foundation
Request Management adc7u Medium ...52FQAPortal%252FPackingMaterial%252FfrmFile4&IsDlg=1' 9e13a59c-dffb-b08d-acd9-c283e837f4a5
07/16/2014 17:32:08.44 w3wp.exe (0x5234) 0x1E7C SharePoint Foundation
Monitoring nasq Medium Entering monitored scope (Request (GET:http://siteurl/frmFile4/Forms/CreateDocument.aspx?Mode=Upload&CheckInComment=&ID=963&RootFolder=%2FQAPortal%2FPackingMaterial%2FfrmFile4&Source=http%3A%2F%2Fuslsp13%3A1111%2FQAPortal%2FPackingMaterial%2FfrmFile4%2FForms%2FAllItems%2Easpx%3FInitialTabId%3DRibbon%252EDocument%26VisibilityContext%3DWSSTabPersistence%26RootFolder%3D%252FQAPortal%252FPackingMaterial%252FfrmFile4&IsDlg=1)).
Parent No
07/16/2014 17:32:08.44 w3wp.exe (0x5234) 0x1E7C SharePoint Foundation
Logging Correlation Data xmnv Medium Name=Request (GET:http://siteurl/frmFile4/Forms/CreateDocument.aspx?Mode=Upload&CheckInComment=&ID=963&RootFolder=%2FQAPortal%2FPackingMaterial%2FfrmFile4&Source=http%3A%2F%2Fuslsp13%3A1111%2FQAPortal%2FPackingMaterial%2FfrmFile4%2FForms%2FAllItems%2Easpx%3FInitialTabId%3DRibbon%252EDocument%26VisibilityContext%3DWSSTabPersistence%26RootFolder%3D%252FQAPortal%252FPackingMaterial%252FfrmFile4&IsDlg=1) 9e13a59c-dffb-b08d-acd9-c283e837f4a5
07/16/2014 17:32:08.46 w3wp.exe (0x5234) 0x1E7C SharePoint Foundation
Authentication Authorization agb9s Medium Non-OAuth request. IsAuthenticated=True, UserIdentityName=0#.w|usl\hclspp, ClaimsCount=23 9e13a59c-dffb-b08d-acd9-c283e837f4a5
07/16/2014 17:32:08.50 w3wp.exe (0x5234) 0x1E7C SharePoint Foundation
Information Rights Management 5202 Information Information Rights Management (IRM): Requesting user email address is empty. 9e13a59c-dffb-b08d-acd9-c283e837f4a5
07/16/2014 17:32:08.50 w3wp.exe (0x5234) 0x1E7C SharePoint Foundation
Files ak8dj High UserAgent not available, file operations may not be optimized.
at Microsoft.SharePoint.SPFileStreamManager.CreateCobaltStreamContainer(SPFileStreamStore spfs, ILockBytes ilb, Boolean copyOnFirstWrite, Boolean disposeIlb) at Microsoft.SharePoint.SPFileStreamManager.SetInputLockBytes(SPFileInfo&
fileInfo, SqlSession session, PrefetchResult prefetchResult) at Microsoft.SharePoint.CoordinatedStreamBuffer.SPCoordinatedStreamBufferFactory.CreateFromDocumentRowset(Guid databaseId, SqlSession session, SPFileStreamManager spfstm,
Object[] metadataRow, SPRowset contentRowset, SPDocumentBindRequest& dbreq, SPDocumentBindResults& dbres) at Microsoft.SharePoint.SPSqlClient.GetDocumentContentRow(Int32 rowOrd, Object ospFileStmMgr, SPDocumentBindRequest&
dbreq, SPDocumentBindResults& dbres... 9e13a59c-dffb-b08d-acd9-c283e837f4a5
07/16/2014 17:32:08.50* w3wp.exe (0x5234) 0x1E7C SharePoint Foundation
Files ak8dj High ...) at Microsoft.SharePoint.Library.SPRequestInternalClass.GetFileAndMetaInfo(String
bstrUrl, Byte bPageView, Byte bPageMode, Byte bGetBuildDependencySet, String bstrCurrentFolderUrl, Int32 iRequestVersion, Byte bMainFileRequest, Boolean& pbCanCustomizePages, Boolean& pbCanPersonalizeWebParts, Boolean& pbCanAddDeleteWebParts, Boolean&
pbGhostedDocument, Boolean& pbDefaultToPersonal, Boolean& pbIsWebWelcomePage, String& pbstrSiteRoot, Guid& pgSiteId, UInt32& pdwVersion, String& pbstrTimeLastModified, String& pbstrContent, UInt32& pdwPartCount, Object&
pvarMetaData, Object& pvarMultipleMeetingDoclibRootFolders, String& pbstrRedirectUrl, Boolean& pbObjectIsList, Guid& pgListId, UInt32& pdwItemId, Int64& pllListFlags, Boolean& pbAccessDenied, Guid& pgDocid, Byte& piLevel,
UInt64& ppermMask, ... 9e13a59c-dffb-b08d-acd9-c283e837f4a5
07/16/2014 17:32:08.50* w3wp.exe (0x5234) 0x1E7C SharePoint Foundation
Files ak8dj High ...Object& pvarBuildDependencySet, UInt32& pdwNumBuildDependencies,
Object& pvarBuildDependencies, String& pbstrFolderUrl, String& pbstrContentTypeOrder, Guid& pgDocScopeId) at Microsoft.SharePoint.Library.SPRequestInternalClass.GetFileAndMetaInfo(String bstrUrl, Byte bPageView, Byte
bPageMode, Byte bGetBuildDependencySet, String bstrCurrentFolderUrl, Int32 iRequestVersion, Byte bMainFileRequest, Boolean& pbCanCustomizePages, Boolean& pbCanPersonalizeWebParts, Boolean& pbCanAddDeleteWebParts, Boolean& pbGhostedDocument,
Boolean& pbDefaultToPersonal, Boolean& pbIsWebWelcomePage, String& pbstrSiteRoot, Guid& pgSiteId, UInt32& pdwVersion, String& pbstrTimeLastModified, String& pbstrContent, UInt32& pdwPartCount, Object& pvarMetaData, Object&
pvarMultipleMeetingDoclibRootFolders, String& pbst... 9e13a59c-dffb-b08d-acd9-c283e837f4a5
07/16/2014 17:32:08.50* w3wp.exe (0x5234) 0x1E7C SharePoint Foundation
Files ak8dj High ...rRedirectUrl, Boolean& pbObjectIsList, Guid&
pgListId, UInt32& pdwItemId, Int64& pllListFlags, Boolean& pbAccessDenied, Guid& pgDocid, Byte& piLevel, UInt64& ppermMask, Object& pvarBuildDependencySet, UInt32& pdwNumBuildDependencies, Object& pvarBuildDependencies,
String& pbstrFolderUrl, String& pbstrContentTypeOrder, Guid& pgDocScopeId) at Microsoft.SharePoint.Library.SPRequest.GetFileAndMetaInfo(String bstrUrl, Byte bPageView, Byte bPageMode, Byte bGetBuildDependencySet, String
bstrCurrentFolderUrl, Int32 iRequestVersion, Byte bMainFileRequest, Boolean& pbCanCustomizePages, Boolean& pbCanPersonalizeWebParts, Boolean& pbCanAddDeleteWebParts, Boolean& pbGhostedDocument, Boolean& pbDefaultToPersonal, Boolean&
pbIsWebWelcomePage, String& pbstrSiteRoot, Guid& pgSiteId, UInt32& pdwVersion,... 9e13a59c-dffb-b08d-acd9-c283e837f4a5
07/16/2014 17:32:08.50* w3wp.exe (0x5234) 0x1E7C SharePoint Foundation
Files ak8dj High ... String& pbstrTimeLastModified, String& pbstrContent,
UInt32& pdwPartCount, Object& pvarMetaData, Object& pvarMultipleMeetingDoclibRootFolders, String& pbstrRedirectUrl, Boolean& pbObjectIsList, Guid& pgListId, UInt32& pdwItemId, Int64& pllListFlags, Boolean& pbAccessDenied,
Guid& pgDocid, Byte& piLevel, UInt64& ppermMask, Object& pvarBuildDependencySet, UInt32& pdwNumBuildDependencies, Object& pvarBuildDependencies, String& pbstrFolderUrl, String& pbstrContentTypeOrder, Guid& pgDocScopeId)
at Microsoft.SharePoint.SPWeb.GetWebPartPageContent(Uri pageUrl, Int32 pageVersion, PageView requestedView, HttpContext context, Boolean forRender, Boolean includeHidden, Boolean mainFileRequest, Boolean fetchDependencyInformation, Boolean& ghostedPage,
String& siteRoot, Guid& siteId, Int64& bytes, ... 9e13a59c-dffb-b08d-acd9-c283e837f4a5
07/16/2014 17:32:08.50* w3wp.exe (0x5234) 0x1E7C SharePoint Foundation
Files ak8dj High ...Guid& docId, UInt32& docVersion, String&
timeLastModified, Byte& level, Object& buildDependencySetData, UInt32& dependencyCount, Object& buildDependencies, SPWebPartCollectionInitialState& initialState, Object& oMultipleMeetingDoclibRootFolders, String& redirectUrl, Boolean&
ObjectIsList, Guid& listId) at Microsoft.SharePoint.ApplicationRuntime.SPRequestModuleData.FetchWebPartPageInformationForInit(HttpContext context, SPWeb spweb, Boolean mainFileRequest, String path, Boolean impersonate, Boolean&
isAppWeb, Boolean& fGhostedPage, Guid& docId, UInt32& docVersion, String& timeLastModified, SPFileLevel& spLevel, String& masterPageUrl, String& customMasterPageUrl, String& webUrl, String& siteUrl, Guid& siteId, Object&
buildDependencySetData, SPWebPartCollectionInitialState& initialState, ... 9e13a59c-dffb-b08d-acd9-c283e837f4a5
07/16/2014 17:32:08.50* w3wp.exe (0x5234) 0x1E7C SharePoint Foundation
Files ak8dj High ...String& siteRoot, String& redirectUrl, Object&
oMultipleMeetingDoclibRootFolders, Boolean& objectIsList, Guid& listId, Int64& bytes) at Microsoft.SharePoint.ApplicationRuntime.SPRequestModuleData.GetFileForRequest(HttpContext context, SPWeb web, Boolean exclusion, String
virtualPath) at Microsoft.SharePoint.ApplicationRuntime.SPRequestModule.InitContextWeb(HttpContext context, SPWeb web) at Microsoft.SharePoint.WebControls.SPControl.SPWebEnsureSPControl(HttpContext context)
at Microsoft.SharePoint.ApplicationRuntime.SPRequestModule.GetContextWeb(HttpContext context) at Microsoft.SharePoint.ApplicationRuntime.SPRequestModule.PostResolveRequestCacheHandler(Object oSender, EventArgs ea)
at System.Web.HttpApplication.SyncEventExecutionStep.System.Web.HttpApplication.IEx... 9e13a59c-dffb-b08d-acd9-c283e837f4a5
07/16/2014 17:32:08.50* w3wp.exe (0x5234) 0x1E7C SharePoint Foundation
Files ak8dj High ...ecutionStep.Execute() at System.Web.HttpApplication.ExecuteStep(IExecutionStep
step, Boolean& completedSynchronously) at System.Web.HttpApplication.PipelineStepManager.ResumeSteps(Exception error) at System.Web.HttpApplication.BeginProcessRequestNotification(HttpContext context, AsyncCallback
cb) at System.Web.HttpRuntime.ProcessRequestNotificationPrivate(IIS7WorkerRequest wr, HttpContext context) at System.Web.Hosting.PipelineRuntime.ProcessRequestNotificationHelper(IntPtr rootedObjectsPointer, IntPtr
nativeRequestContext, IntPtr moduleData, Int32 flags) at System.Web.Hosting.PipelineRuntime.ProcessRequestNotification(IntPtr rootedObjectsPointer, IntPtr nativeRequestContext, IntPtr moduleData, Int32 flags)
at System.Web.Hosting.UnsafeIISMethods.MgdIndicateCompl... 9e13a59c-dffb-b08d-acd9-c283e837f4a5
07/16/2014 17:32:08.50* w3wp.exe (0x5234) 0x1E7C SharePoint Foundation
Files ak8dj High ...etion(IntPtr pHandler, RequestNotificationStatus&
notificationStatus) at System.Web.Hosting.UnsafeIISMethods.MgdIndicateCompletion(IntPtr pHandler, RequestNotificationStatus& notificationStatus) at System.Web.Hosting.PipelineRuntime.ProcessRequestNotificationHelper(IntPtr
rootedObjectsPointer, IntPtr nativeRequestContext, IntPtr moduleData, Int32 flags) at System.Web.Hosting.PipelineRuntime.ProcessRequestNotification(IntPtr rootedObjectsPointer, IntPtr nativeRequestContext, IntPtr moduleData, Int32 flags)
9e13a59c-dffb-b08d-acd9-c283e837f4a5
07/16/2014 17:32:08.50 w3wp.exe (0x5234) 0x1E7C SharePoint Foundation
Files aiv4w Medium Spent 0 ms to bind 13297 byte file stream 9e13a59c-dffb-b08d-acd9-c283e837f4a5
07/16/2014 17:32:08.52 mssearch.exe (0x1B6C) 0x3F24 SharePoint Server Search Crawler:Content
Plugin ajjig Medium CSSFeeder::ReportPingSession: the session 86ba63af-ce5a-4c7c-a836-c70ee9961b91 has been pinged by PollCallbacks
07/16/2014 17:32:08.52 w3wp.exe (0x5234) 0x1E7C SharePoint Foundation
Information Rights Management ai4ko Medium Information Rights Management (IRM): The IRM status of the document [] is: untried (-1), the return values is: 0x0. 9e13a59c-dffb-b08d-acd9-c283e837f4a5
07/16/2014 17:32:08.52 w3wp.exe (0x5234) 0x1E7C SharePoint Foundation
Files ahjkm Medium Spent 0 ms to send 13297 byte file stream 9e13a59c-dffb-b08d-acd9-c283e837f4a5
07/16/2014 17:32:08.52 w3wp.exe (0x5234) 0x1E7C SharePoint Foundation
Logging Correlation Data xmnv Medium Site=/ 9e13a59c-dffb-b08d-acd9-c283e837f4a5
07/16/2014 17:32:08.52 w3wp.exe (0x5234) 0x1E7C SharePoint Foundation
Monitoring b4ly High Leaving Monitored Scope (PostResolveRequestCacheHandler). Execution Time=66.9455 9e13a59c-dffb-b08d-acd9-c283e837f4a5
07/16/2014 17:32:08.57 w3wp.exe (0x2CE0) 0x45CC SharePoint Server
Logging Correlation Data xmnv Medium Name=Task: SessionManager.PerformOngoingRequestDepartures 6b6b4445-2b72-0002-9138-988147c881a0
07/16/2014 17:32:08.58 w3wp.exe (0x5234) 0x1E7C SharePoint Foundation
General g3ql High [Forced due to logging gap, cached @ 07/16/2014 17:32:08.55, Original
Level: Verbose] GetUriScheme(/QAPortal/PackingMaterial/frmFile4) 9e13a59c-dffb-b08d-acd9-c283e837f4a5
07/16/2014 17:32:08.58 w3wp.exe (0x5234) 0x1E7C SharePoint Foundation
Database 8acb High [Forced due to logging gap, Original Level: VerboseEx] Reverting to process
identity 9e13a59c-dffb-b08d-acd9-c283e837f4a5
07/16/2014 17:32:08.64 OWSTIMER.EXE (0x1ED4) 0x5B18 SharePoint Foundation Monitoring
nasq Medium Entering monitored scope -
Export and import XMLType table
Hi,
I want to export a table containing an XMLType column from Oracle 11.2.0.1.0 and import it into version 11.2.0.2.0.
I got the following error when I exported the table with the exp and imp utilities:
EXP-00107: Feature (BINARY XML) of column ZZZZ in table XXXX.YYYY is not supported. The table will not be exported.
I then tried Data Pump export and import. The Data Pump export works; the following is the log:
Export: Release 11.2.0.1.0 - Production on Wed Oct 17 17:53:41 2012
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
;;; Legacy Mode Active due to the following parameters:
;;; Legacy Mode Parameter: "log=<xxxxx>Oct17.log" Location: Command Line, Replaced with: "logfile=T<xxxxx>_Oct17.log"
;;; Legacy Mode has set reuse_dumpfiles=true parameter.
Starting "<xxxxx>"."SYS_EXPORT_TABLE_01": <xxxxx>/******** DUMPFILE=<xxxxx>Oct172.dmp TABLES=<xxxxx>.<xxxxx> logfile=<xxxxx>Oct17.log reusedumpfiles=true
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 13.23 GB
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type TABLE_EXPORT/TABLE/INDEX/FUNCTIONAL_AND_BITMAP/INDEX
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/FUNCTIONAL_AND_BITMAP/INDEX_STATISTICS
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
. . exported "<xxxxx>"."<xxxxx>" 13.38 GB 223955 rows
Master table "<xxxxx>"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
Dump file set for <xxxxx>.SYS_EXPORT_TABLE_01 is:
E:\ORACLEDB\ADMIN\LOCALORA11G\DPDUMP\<xxxxx>OCT172.DMP
Job "<xxxxx>"."SYS_EXPORT_TABLE_01" successfully completed at 20:30:14
I got an error when I imported the dump using the following command:
impdp sys_dba/***** dumpfile=XYZ_OCT17_2.DMP logfile=import_vmdb_XYZ_Oct17_2.log FROMUSER=XXXX TOUSER=YYYY CONTENT=DATA_ONLY TRANSFORM=oid:n TABLE_EXISTS_ACTION=append
The error is:
KUP-11007: conversion error loading table "CC_DBA"."XXXX"
ORA-01403: no data found
ORA-31693: Table data object "XXX_DBA"."XXXX" failed to load/unload and is being skipped due to error:
Please help me find a solution for this.
We created the following index because we are storing files like the one below:
CREATE UNIQUE INDEX "XXXXX"."XXXX_XBRL_XMLINDEX_IX" ON "CCCC"."XXXX_XBRL" (EXTRACTVALUE(SYS_MAKEXML(128,"SYS_NC00014$"),'/xbrl'))
<?xml version="1.0" encoding="UTF-8"?>
<xbrl xmlns="http://www.xbrl.org/2003/instance" xmlns:AAAAAA="http://www.AAAAAA.COM/AAAAAA" xmlns:ddd4217="http://www.xbrl.org/2003/ddd4217" xmlns:link="http://www.xbrl.org/2003/linkbase" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<link:schemaRef xlink:href="http://www.fsm.AAAAAA.COM/AAAAAA-xbrl/v1/AAAAAA-taxonomy-2009-v2.11.xsd" xlink:type="simple" />
<context id="Company_Current_ForPeriod"> ...
I tried the Data Pump export both with and without DATA_OPTIONS=XML_CLOBS. Both times the export succeeded, but the import failed with the same KUP-11007: conversion error loading table "tab_owner"."bbbbb_XBRL" error.
I tried the import in different ways:
1. Create the table first, then import the data only (CONTENT=DATA_ONLY)
2. Import the whole table
Either way the table DDL is created successfully, but the data fails to import. The following is the log from the import:
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Master table "aaaaa"."SYS_IMPORT_TABLE_02" successfully loaded/unloaded
Starting "aaaaa"."SYS_IMPORT_TABLE_02": aaaaa/********@vm_dba TABLES=tab_owner.bbbbb_XBRL dumpfile=bbbbb_XBRL_OCT17_2.DMP logfile=import_vmdb
bbbbb_XBRL_Oct29.log DATA_OPTIONS=SKIP_CONSTRAINT_ERRORS
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
KUP-11007: conversion error loading table "tab_owner"."bbbbb_XBRL"
ORA-01403: no data found
ORA-31693: Table data object "tab_owner"."bbbbb_XBRL" failed to load/unload and is being skipped due to error:
ORA-29913: error in executing ODCIEXTTABLEFETCH callout
ORA-26062: Can not continue from previous errors.
Processing object type TABLE_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type TABLE_EXPORT/TABLE/INDEX/FUNCTIONAL_AND_BITMAP/INDEX
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/FUNCTIONAL_AND_BITMAP/INDEX_STATISTICS
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Job "aaaaa"."SYS_IMPORT_TABLE_02" completed with 1 error(s) at 18:42:26 -
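One workaround worth trying for the KUP-11007 / ORA-26062 failure above, sketched below. This is a hedged suggestion, not a verified fix: the function-based EXTRACTVALUE index on the XMLType column is a plausible culprit during the data load, and all object names are the placeholders from the post.

```shell
# 1. Import the table without any indexes, so the functional
#    EXTRACTVALUE index cannot interfere while rows are loaded.
impdp sys_dba/***** DUMPFILE=bbbbb_XBRL_OCT17_2.DMP \
    LOGFILE=imp_no_idx.log \
    TABLES=tab_owner.bbbbb_XBRL \
    EXCLUDE=INDEX \
    TABLE_EXISTS_ACTION=REPLACE

# 2. Recreate the functional index afterwards via SQL*Plus.
sqlplus sys_dba/***** <<'SQL'
CREATE UNIQUE INDEX "XXXXX"."XXXX_XBRL_XMLINDEX_IX"
  ON "CCCC"."XXXX_XBRL"
  (EXTRACTVALUE(SYS_MAKEXML(128,"SYS_NC00014$"),'/xbrl'));
SQL
```

If the load still fails with the indexes excluded, the next thing to check is whether the XMLType storage model (binary XML vs. CLOB) differs between the 11.2.0.1 source and the 11.2.0.2 target, since exp already flagged BINARY XML as unsupported.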
Hi, I have a test database 9i (9.2.0.1) on Windows XP. I want to export a table from the test database to a development database, which is 9i (9.2.0.6) on IBM AIX 5.2. Until now I was able to export and import, but suddenly I am getting a 'TABLESPACE <tablespace name> doesn't exist' error.
The following is what I am getting:
D:\>imp file=news.dmp fromuser='COUNTER' touser='IBT' tables='NEWS_MASTER'
Import: Release 9.2.0.1.0 - Production on Sat Oct 6 17:06:00 2007
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Username: system@dev
Password:
Connected to: Oracle9i Enterprise Edition Release 9.2.0.6.0 - 64bit Production
With the Partitioning, Real Application Clusters, OLAP and Oracle Data Mining options
JServer Release 9.2.0.6.0 - Production
Export file created by EXPORT:V09.02.00 via conventional path
Warning: the objects were exported by COUNTER, not by you
import done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
import server uses US7ASCII character set (possible charset conversion)
. importing COUNTER's objects into IBT
IMP-00017: following statement failed with ORACLE error 959:
"CREATE TABLE "NEWS_MASTER" ("NEWS_DATE" DATE NOT NULL ENABLE, "SUBJECT" VAR"
"CHAR2(75) NOT NULL ENABLE, "DESCRIPTION" VARCHAR2(300) NOT NULL ENABLE, "PH"
"OTO" BLOB, "REG_NO" NUMBER(5, 0), "SYSTEM_IP" VARCHAR2(15) NOT NULL ENABLE)"
" PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 STORAGE(INITIAL 65536 FREEL"
"ISTS 1 FREELIST GROUPS 1) TABLESPACE "TESTTBS" LOGGING NOCOMPRESS LOB ("PHO"
"TO") STORE AS (TABLESPACE "TESTTBS" ENABLE STORAGE IN ROW CHUNK 8192 PCTVE"
"RSION 10 NOCACHE STORAGE(INITIAL 65536 FREELISTS 1 FREELIST GROUPS 1))"
IMP-00003: ORACLE error 959 encountered
ORA-00959: tablespace 'TESTTBS' does not exist
Import terminated successfully with warnings.
Help me, I want to know what actually happened and what the solution may be.
Thanks in advance.
The tablespace TESTTBS doesn't exist in the database you are importing into.
Either create the tablespace in the target database, or use the show=y option while importing to get the table structure, create the table manually in the target database, then import with ignore=y. -
Export and Import Integration Profile - Demantra
Hi All,
I am working on a Demantra workflow that exports and imports integration profiles.
I want to avoid a customization to integrate the export profile data created from series A and import that data into another import profile created from series B.
I have done the following:
1. I created an export integration profile from series A, and when I launched the workflow I found the following: a view BIEO_view_name and a table BIIO_table_name got created in the transfer_query table. When I ran select * from BIEO_view_name I got records, but select * from BIIO_table_name returned no data.
2. I want to populate data into the BIIO_table_name created from the export profile. Is there any standard process to populate BIIO_table_name from BIEO_view_name without writing custom PL/SQL?
I need this data to populate an import profile created through series B.
I want to avoid the customization of writing custom PL/SQL, and want to know whether there is any seeded process for integrating the export profile from series A with the import profile from series B.
Any help is appreciated.
Regards,
AK
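For reference, if no seeded process exists, the customization being avoided would amount to a single insert-as-select between the generated objects. A sketch only: `demantra_user` is a hypothetical account, the object names are the placeholders from the post, and real code should spell out the column lists rather than use SELECT *.

```shell
# Hypothetical manual bridge from the export view to the import table.
sqlplus demantra_user/***** <<'SQL'
INSERT INTO BIIO_table_name
  SELECT * FROM BIEO_view_name;
COMMIT;
SQL
```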
Edited by: Ajaykunde on Nov 28, 2009 8:53 AM
Hi,
Before importing your BS, did you import the Technical System?
Before importing your Technical System, did you import the SWCV?
Before importing your SWCV, did you import the Product?
If no, you have to respect a sequence for the import: from the highest to the lowest level.
If yes, then in your BS defined in Dev, do you have a transport group that says "BS_dev -> BS_Qua"?
If yes, both BS have to be defined in the Qua SLD; otherwise your transport link has nothing to link to.
Moreover, when you import something, you normally get a detailed error message which explains what is missing.
>> Question: How many business systems should I maintain in the SLD?
No limit, I think.
>> Question: Should I do any config at the transport group?
Only if you want your BS_dev to automatically become BS_Qua after the transport... so for me, it's yes.
regards.
Mickael