Data files removed during language import
Hi,
While running a language import on ECC 6.0, someone removed the data files from the /usr/sap/trans/data directory.
We are facing two problems now:
1. The language that must be imported is now locked in SMLT (no lock exists in SM12, as described in the help).
2. We can't find out how to recreate the data file from the .PAT file located on the install DVD. (Some people have told me they have done this before, but can't remember how.)
Can someone please help?
Thanks in advance,
Patrice.
What I know is, if the data files are deleted, we can't add them back manually; it is better to reinstall. Before that, take a backup of the existing ones.
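The backup step can be sketched like this. The paths and file name are placeholders: a scratch directory stands in for /usr/sap/trans so the commands can run anywhere.

```shell
# Back up the transport data directory before re-running the language import.
# In a real system TRANS_DIR would be /usr/sap/trans; a scratch directory is
# used here so the sketch runs anywhere.
TRANS_DIR=$(mktemp -d)
mkdir -p "$TRANS_DIR/data"
touch "$TRANS_DIR/data/R9000123.ECC"   # placeholder for a real data file
ts=$(date +%Y%m%d%H%M%S)
tar -czf "$TRANS_DIR/data_backup_$ts.tar.gz" -C "$TRANS_DIR" data
echo "backup written to $TRANS_DIR/data_backup_$ts.tar.gz"
```

On a real system you would point TRANS_DIR at /usr/sap/trans and verify the archive with `tar -tzf` before reinstalling.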
regards
ganesh
Similar Messages
-
Dump DATASET_REOPEN during language import
Hi,
I am installing the Norwegian language on an ECC 6.0 system.
I am at the "import patches" step. I have imported ~200 patches, but there is a problem while importing the last 8.
During the import of SAPKE60064 the system dumps with the error DATASET_REOPEN.
File /usr/sap/trans/EPS/in/CSN0120061532_0048195.PAT is already opened.
I have searched, but I can't find where the file is already opened.
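One way to find which process still holds the file open is to scan /proc. This is a Linux-specific sketch (on other platforms `fuser` or `lsof` does the same job), and a scratch file stands in for the real .PAT file so it can run anywhere:

```shell
# Find processes that hold a given file open by scanning /proc/*/fd.
PAT=$(mktemp)                  # stands in for /usr/sap/trans/EPS/in/...PAT
sleep 5 < "$PAT" &             # background process holding the file open
holder=$!
sleep 1                        # give it a moment to start
found=""
for fd in /proc/[0-9]*/fd/*; do
  if [ "$(readlink "$fd" 2>/dev/null)" = "$PAT" ]; then
    pid=${fd#/proc/}; pid=${pid%%/*}
    found="$found $pid"
  fi
done
echo "held open by PID(s):$found"
```

Run against the real .PAT path, this tells you which PID to investigate (or restart) before retrying the import.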
I hope someone can help.
Thanks
Jens
Hi Jens,
We had the same/similar problem (with French) and solved this issue by doing the following which we found in note 975891:
"For this reason, first use transaction SAINT to unpack this package manually by choosing 'Utilities-> Disassemble OCS Package' from the menu."
-> We did this for each of the eight missing packages that caused the dump you described later in the patch import process.
Best wishes
Nik -
SQL*Loader to insert data file name during load
I'd like to use a single control file to load data from different files (at different times) to the same table. I'd like this table to have a column to hold the name of the file the data came from. Is there a way for SQL*Loader to automatically do this? (I.e., as opposed to running an update query separately.) I can edit the control file before each load to set a CONSTANT to hold the new data file name, but I'd like it to pick this up automatically.
Thanks for any help.
-- Harvey
Hello Harvey.
I've previously attempted to store a value in a global/local OS variable and use it within a SQL*Loader control file (Unix OS and Oracle versions between 7.3.4 and 10g). I was unsuccessful with every attempt and approach I could imagine. However, it was very easy to use a sed script to make a copy of the control file, changing a string within it.
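The sed approach can be sketched like this (the table and column names are made up for illustration):

```shell
# Keep a template control file with a placeholder, then generate the real
# control file for each load by substituting the current data file name.
cat > template.ctl <<'EOF'
LOAD DATA
INFILE '@DATAFILE@'
APPEND INTO TABLE staging_table
FIELDS TERMINATED BY ','
(col1, col2, source_file CONSTANT '@DATAFILE@')
EOF

DATAFILE=batch_20090131.dat
sed "s/@DATAFILE@/$DATAFILE/g" template.ctl > load.ctl
echo "generated control file:"
cat load.ctl
```

sqlldr would then be invoked with control=load.ctl; each load gets its data file name burned in as a CONSTANT.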
Do you really want to store a file name on each and every record? Perhaps an alternative would be to use a relational model. Create a file upload log table that would store the file name and an upload # and then have the SQL*Loader control file call a function that would read that table for the most recent upload #. You'll save some disk space too.
Hope this helps,
Luke -
Data access error during workflow- import in FDM
I am trying to perform an import in FDM but am getting a 'Data Access Error'.
Any suggestions on this?
Hello,
I have the same problem.
Can anyone help?
Here is the error log:
** Begin FDM Runtime Error Log Entry [2009-01-30-12:44:44] **
ERROR:
Code......................................... -2147217900
Description.................................. ORA-06550: line 2, column 13:
PL/SQL: ORA-00913: too many values
ORA-06550: line 2, column 1:
PL/SQL: SQL Statement ignored
ORA-06550: line 3, column 13:
PL/SQL: ORA-00913: too many values
ORA-06550: line 3, column 1:
PL/SQL: SQL Statement ignored
ORA-06550: line 4, column 13:
PL/SQL: ORA-00913: too many values
ORA-06550: line 4, column 1:
PL/SQL: SQL Statement ignored
ORA-06550: line 5, column 13:
PL/SQL: ORA-00913: too many values
ORA-06550: line 5, column 1:
PL/SQL: SQL Statement ignored
ORA-06550: line 6, column 13:
PL/SQL: ORA-00913: too many values
ORA-06550: line 6, column 1:
PL/SQL: SQL Statement ignored
ORA-06550: line 7, column 13:
PL/SQL: ORA-00913: too many values
ORA-06550: line 7, column 1:
PL/SQL: SQL Statement ignored
ORA-06550: line 8, column 13:
PL/SQL: ORA-00913: too many values
ORA-06550: line 8, column 1:
PL/SQL: SQL Statement ignored
ORA-06550: line 9, column 13:
PL/SQL: ORA-00913: too many values
ORA-06550: line 9, column 1:
PL/SQL: SQL Statement ignored
ORA-06550: line 10, column 13:
BEGIN
INSERT INTO tWselhou3242683022 (PartitionKey, CatKey, PeriodKey, DataView, Account, Entity, ICP, UD1, UD2, UD3, UD4, UD5, UD6, UD7, UD8, UD9, UD10, UD11, UD12, UD13, UD14, UD15, UD16, UD17, UD18, UD19, UD20, Desc1, Desc2, Attr1, Attr2, Attr3, Attr4, Attr5, Attr6, Attr7, Attr8, Attr9, Attr10, Attr11, Attr12, Attr13, Attr14, MemoKey, Amount ) VALUES (752, 12, '20090131', 'YTD', 'Required Field Missing.', 'Required Field Missing.', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', 0, 45645656,00);
INSERT INTO tWselhou3242683022 (PartitionKey, CatKey, PeriodKey, DataView, Account, Entity, ICP, UD1, UD2, UD3, UD4, UD5, UD6, UD7, UD8, UD9, UD10, UD11, UD12, UD13, UD14, UD15, UD16, UD17, UD18, UD19, UD20, Desc1, Desc2, Attr1, Attr2, Attr3, Attr4, Attr5, Attr6, Attr7, Attr8, Attr9, Attr10, Attr11, Attr12, Attr13, Attr14, MemoKey, Amount ) VALUES (752, 12, '20090131', 'YTD', 'Required Field Missing.', 'Required Field Missing.', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', 0, 456546564,00);
INSERT INTO tWselhou3242683022 (PartitionKey, CatKey, PeriodKey, DataView, Account, Entity, ICP, UD1, UD2, UD3, UD4, UD5, UD6, UD7, UD8, UD9, UD10, UD11, UD12, UD13, UD14, UD15, UD16, UD17, UD18, UD19, UD20, Desc1, Desc2, Attr1, Attr2, Attr3, Attr4, Attr5, Attr6, Attr7, Attr8, Attr9, Attr10, Attr11, Attr12, Attr13, Attr14, MemoKey, Amount ) VALUES (752, 12, '20090131', 'YTD', 'Required Field Missing.', 'Required Field Missing.', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', 0, 4556465454,00);
INSERT INTO tWselhou3242683022 (PartitionKey, CatKey, PeriodKey, DataView, Account, Entity, ICP, UD1, UD2, UD3, UD4, UD5, UD6, UD7, UD8, UD9, UD10, UD11, UD12, UD13, UD14, UD15, UD16, UD17, UD18, UD19, UD20, Desc1, Desc2, Attr1, Attr2, Attr3, Attr4, Attr5, Attr6, Attr7, Attr8, Attr9, Attr10, Attr11, Attr12, Attr13, Attr14, MemoKey, Amount ) VALUES (752, 12, '20090131', 'YTD', 'Required Field Missing.', 'Required Field Missing.', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', 0, 546,00);
INSERT INTO tWselhou3242683022 (PartitionKey, CatKey, PeriodKey, DataView, Account, Entity, ICP, UD1, UD2, UD3, UD4, UD5, UD6, UD7, UD8, UD9, UD10, UD11, UD12, UD13, UD14, UD15, UD16, UD17, UD18, UD19, UD20, Desc1, Desc2, Attr1, Attr2, Attr3, Attr4, Attr5, Attr6, Attr7, Attr8, Attr9, Attr10, Attr11, Attr12, Attr13, Attr14, MemoKey, Amount ) VALUES (752, 12, '20090131', 'YTD', 'Required Field Missing.', 'Required Field Missing.', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', 0, 654,00);
INSERT INTO tWselhou3242683022 (PartitionKey, CatKey, PeriodKey, DataView, Account, Entity, ICP, UD1, UD2, UD3, UD4, UD5, UD6, UD7, UD8, UD9, UD10, UD11, UD12, UD13, UD14, UD15, UD16, UD17, UD18, UD19, UD20, Desc1, Desc2, Attr1, Attr2, Attr3, Attr4, Attr5, Attr6, Attr7, Attr8, Attr9, Attr10, Attr11, Attr12, Attr13, Attr14, MemoKey, Amount ) VALUES (752, 12, '20090131', 'YTD', 'Required Field Missing.', 'Required Field Missing.', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', 0, 546,00);
INSERT INTO tWselhou3242683022 (PartitionKey, CatKey, PeriodKey, DataView, Account, Entity, ICP, UD1, UD2, UD3, UD4, UD5, UD6, UD7, UD8, UD9, UD10, UD11, UD12, UD13, UD14, UD15, UD16, UD17, UD18, UD19, UD20, Desc1, Desc2, Attr1, Attr2, Attr3, Attr4, Attr5, Attr6, Attr7, Attr8, Attr9, Attr10, Attr11, Attr12, Attr13, Attr14, MemoKey, Amount ) VALUES (752, 12, '20090131', 'YTD', 'Required Field Missing.', 'Required Field Missing.', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', 0, 654,00);
INSERT INTO tWselhou3242683022 (PartitionKey, CatKey, PeriodKey, DataView, Account, Entity, ICP, UD1, UD2, UD3, UD4, UD5, UD6, UD7, UD8, UD9, UD10, UD11, UD12, UD13, UD14, UD15, UD16, UD17, UD18, UD19, UD20, Desc1, Desc2, Attr1, Attr2, Attr3, Attr4, Attr5, Attr6, Attr7, Attr8, Attr9, Attr10, Attr11, Attr12, Attr13, Attr14, MemoKey, Amount ) VALUES (752, 12, '20090131', 'YTD', 'Required Field Missing.', 'Required Field Missing.', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', 0, 564,00);
INSERT INTO tWselhou3242683022 (PartitionKey, CatKey, PeriodKey, DataView, Account, Entity, ICP, UD1, UD2, UD3, UD4, UD5, UD6, UD7, UD8, UD9, UD10, UD11, UD12, UD13, UD14, UD15, UD16, UD17, UD18, UD19, UD20, Desc1, Desc2, Attr1, Attr2, Attr3, Attr4, Attr5, Attr6, Attr7, Attr8, Attr9, Attr10, Attr11, Attr12, Attr13, Attr14, MemoKey, Amount ) VALUES (752, 12, '20090131', 'YTD', 'Required Field Missing.', 'Required Field Missing.', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', 0, 564,00);
END;
Procedure.................................... clsDataManipulation.fExecuteDML
Component.................................... upsWDataWindowDM
Version...................................... 931
Thread....................................... 268
IDENTIFICATION:
User......................................... selhouda
Computer Name................................ FR900VM0044
App Name..................................... test
Client App................................... WebClient
CONNECTION:
Provider..................................... ORAOLEDB.ORACLE
Data Server..................................
Database Name................................ PARHYPT3.WORLD
Trusted Connect.............................. False
Connect Status.. Connection Open
GLOBALS:
Location..................................... LOCATION4
Location ID.................................. 752
Location Seg................................. 6
Category..................................... WLCAT
Category ID.................................. 12
Period....................................... janv - 2009
Period ID.................................... 31/01/2009
POV Local.................................... False
Language..................................... 1033
User Level................................... 1
All Partitions............................... True
Is Auditor................................... False
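A hedged observation on the generated SQL above (an assumption, not something confirmed in the thread): each VALUES list ends in a comma-decimal amount such as 45645656,00, so Oracle sees one value more than there are columns, which is exactly what ORA-00913 "too many values" reports. A minimal illustration of the counting:

```shell
# Count comma-separated values: a decimal comma in the amount turns one
# value into two, so the VALUES list no longer matches the column list.
cols="PartitionKey, CatKey, Amount"     # 3 columns (stand-in for the 45 above)
ok="752, 12, 45645656.00"               # decimal point: 3 values
bad="752, 12, 45645656,00"              # decimal comma: 4 values
count() { echo "$1" | awk -F, '{print NF}'; }
echo "columns=$(count "$cols") ok=$(count "$ok") bad=$(count "$bad")"
```

If that reading is right, forcing a period decimal separator (or quoting the amount) in the import format would be the thing to check.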
** Begin FDM Runtime Error Log Entry [2009-01-30-12:44:44] **
ERROR:
Code......................................... -2147217900
Description.................................. Data access error.
Procedure.................................... clsImpDataPump.fProcessSQLInsertValues
Component.................................... upsWObjectsDM
Version...................................... 931
Thread....................................... 268
IDENTIFICATION:
User......................................... selhouda
Computer Name................................ FR900VM0044
App Name..................................... test
Client App................................... WebClient
CONNECTION:
Provider..................................... ORAOLEDB.ORACLE
Data Server..................................
Database Name................................ PARHYPT3.WORLD
Trusted Connect.............................. False
Connect Status.. Connection Open
GLOBALS:
Location..................................... LOCATION4
Location ID.................................. 752
Location Seg................................. 6
Category..................................... WLCAT
Category ID.................................. 12
Period....................................... janv - 2009
Period ID.................................... 31/01/2009
POV Local.................................... False
Language..................................... 1033
User Level................................... 1
All Partitions............................... True
Is Auditor................................... False
** Begin FDM Runtime Error Log Entry [2009-01-30-12:44:44] **
ERROR:
Code......................................... -2147217900
Description.................................. Data access error.
Procedure.................................... clsImpDataPump.fImportTextFile
Component.................................... upsWObjectsDM
Version...................................... 931
Thread....................................... 268
IDENTIFICATION:
User......................................... selhouda
Computer Name................................ FR900VM0044
App Name..................................... test
Client App................................... WebClient
CONNECTION:
Provider..................................... ORAOLEDB.ORACLE
Data Server..................................
Database Name................................ PARHYPT3.WORLD
Trusted Connect.............................. False
Connect Status.. Connection Open
GLOBALS:
Location..................................... LOCATION4
Location ID.................................. 752
Location Seg................................. 6
Category..................................... WLCAT
Category ID.................................. 12
Period....................................... janv - 2009
Period ID.................................... 31/01/2009
POV Local.................................... False
Language..................................... 1033
User Level................................... 1
All Partitions............................... True
Is Auditor................................... False
** Begin FDM Runtime Error Log Entry [2009-01-30-12:44:44] **
ERROR:
Code......................................... -2147217900
Description.................................. Data access error.
Procedure.................................... clsImpProcessMgr.fLoadAndProcessFile
Component.................................... upsWObjectsDM
Version...................................... 931
Thread....................................... 268
IDENTIFICATION:
User......................................... selhouda
Computer Name................................ FR900VM0044
App Name..................................... test
Client App................................... WebClient
CONNECTION:
Provider..................................... ORAOLEDB.ORACLE
Data Server..................................
Database Name................................ PARHYPT3.WORLD
Trusted Connect.............................. False
Connect Status.. Connection Open
GLOBALS:
Location..................................... LOCATION4
Location ID.................................. 752
Location Seg................................. 6
Category..................................... WLCAT
Category ID.................................. 12
Period....................................... janv - 2009
Period ID.................................... 31/01/2009
POV Local.................................... False
Language..................................... 1033
User Level................................... 1
All Partitions............................... True
Is Auditor................................... False -
Urgent: ORA-01403: no data found Error during Order Import
Hi Experts,
I am performing order import by populating the order interface tables and then running the Order Import concurrent program. I am encountering the following error while running the Order Import program:
No. of orders failed: 1
"*ora-01403: no data found in package oe_order_pvt procedure lines*"
Can anyone please provide some pointers on why this is occurring and how it can be overcome? Any pointers will be immensely helpful.
Thanks,
Ganapathi
Hi Nagamohan,
Thanks for your response. I tried calling mo_global.set_policy_context('S', <org_id>) before invoking the concurrent program using fnd_request.submit_request(...), but this still doesn't seem to prevent the 'no data found' issue.
One more thing I noticed: this happens while importing the customer along with the order. I've ensured that I use the same org_id for the customer data as well. Can you please let me know whether there is anything else I should check?
Thanks,
Ganapathi -
GRC AC v5.3 CUP - Initial Data Files
Hi All,
re: GRC AC v5.3 CUP - Initial Data Files
Our GRC-AC v5.3 CUP Dev system has too many roles, and we want it to match our GRC-AC v5.3 CUP QA system. We do not want to go through and delete them one role at a time. Would it be the correct procedure to export the CUP QA system's "Initial Data" files for "Roles" and import them into CUP DEV using the "Clean and Insert" setting?
Both systems are on the same CUP Version / SP / Build.
Any help on this would be greatly appreciated.
Thanks,
John
> John Stephens wrote:
> Hi All,
>
> Yes, you are correct. An attempt gives the following error message:
>
> "x Please select Insert or Append option, to avoid Data Integrity errors, Clean and Insert option is not available."
>
> We came up with some additional options that we will pursue to resolve this.
>
> Thanks for your help on this.
>
> -john
Hi John,
Can you elaborate on what options you pursued to do mass role removal in CUP?
Thanks!
Jes -
How to store data file name in one of the columns of staging table
My requirement is to load data from a .dat file into an Oracle staging table. I have done the following steps:
1. Created control file and stored in bin directory.
2. Created data file and stored in bin directory.
3. Registered a concurrent program with execution method as SQL*Loader.
4. Added the concurrent program to request group.
I am passing the file name as a parameter to the concurrent program. When I run the program, the data is loaded into the staging table correctly.
Now I want to store the file name (which is passed as a parameter) in one of the columns of the staging table. I tried different approaches found through Google, but none of them worked. I am using the control file below:
OPTIONS (SKIP = 1)
LOAD DATA
INFILE '&1'
APPEND INTO TABLE XXCISCO_SO_INTF_STG_TB
FIELDS TERMINATED BY ","
OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(COUNTRY_NAME
,COUNTRY_CODE
,ORDER_CATEGORY
,ORDER_NUMBER
,RECORD_ID "XXCISCO_SO_INTF_STG_TB_S.NEXTVAL"
,FILE_NAME CONSTANT "&1"
,REQUEST_ID "fnd_global.conc_request_id"
,LAST_UPDATED_BY "FND_GLOBAL.USER_ID"
,LAST_UPDATE_DATE SYSDATE
,CREATED_BY "FND_GLOBAL.USER_ID"
,CREATION_DATE SYSDATE
,INTERFACE_STATUS CONSTANT "N"
,RECORD_STATUS CONSTANT "N")
I want to store the file name in the FILE_NAME column stated above. I tried with and without CONSTANT, using "$1", "&1", ":$1", ":&1", &1, $1... but none of them worked. Please suggest a solution.
Thanks,
Abhay
Please post details of OS, database, and EBS versions. There is no easy way to achieve this.
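One common workaround (a sketch with assumed names, not something from this thread): register a host script as the concurrent program's executable, have it write the control file per run so the file name is burned in as a CONSTANT, and only then call sqlldr:

```shell
# Generate the control file at run time; DATA_FILE would arrive as the
# concurrent-program parameter in a real host script (name is an assumption).
DATA_FILE=orders_20090131.dat
cat > run_load.ctl <<EOF
OPTIONS (SKIP = 1)
LOAD DATA
INFILE '$DATA_FILE'
APPEND INTO TABLE XXCISCO_SO_INTF_STG_TB
FIELDS TERMINATED BY ","
TRAILING NULLCOLS
(COUNTRY_NAME
,FILE_NAME CONSTANT '$DATA_FILE')
EOF
# The real script would now run: sqlldr userid=... control=run_load.ctl
echo "control file ready for $DATA_FILE"
```

This sidesteps the substitution problem entirely, since the shell (not SQL*Loader) expands the parameter.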
Pl see previous threads on this topic
SQL*Loader to insert data file name during load
Sql Loader with new column
HTH
Srini -
Tungsten T3 - need to access DAT file with addresses - Please Help
I have an old Palm Tungsten T3. The desktop computer with the synced address data has died and the T3 has lost power (so no addresses anymore).
I have a .DAT file backup with all my addresses. Can anyone advise how I can get access to these?
Reinstalling Palm Desktop 4.1.4e did not help, as it doesn't allow me to import .DAT files.
Can I import this file into any other app or open up in Excel etc?
Any advice would be much appreciated.
Thanks in advance.
Post relates to: Tungsten T3
This question was solved.
View Solution.
You have to physically copy the .dat file into the User directory over the existing address.dat file. Find the folder with your HotSync name abbreviated (i.e., C:/Program Files/Palm (or PalmOne)/ your Hotsync name /address) and copy the .dat file there.
Be sure you copy the file and keep another one safe in case something goes wrong. The existing blank address.dat file will be only about 2k in size.
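The copy-and-keep-a-spare step can be sketched as follows (scratch directories stand in for the Palm Desktop user folder, and the file contents are placeholders):

```shell
# Overwrite the blank address.dat with the backup, keeping a safety copy
# of whatever was there before.
USERDIR=$(mktemp -d)            # stands in for .../Palm/<HotsyncName>/address
BACKUPDIR=$(mktemp -d)          # where the backed-up .dat file lives
printf 'blank'    > "$USERDIR/address.dat"
printf 'contacts' > "$BACKUPDIR/address.dat"
cp "$USERDIR/address.dat" "$USERDIR/address.dat.safe"   # safety copy first
cp "$BACKUPDIR/address.dat" "$USERDIR/address.dat"
echo "restored address.dat"
```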
Good luck!
WyreNut
I am a Volunteer here, not employed by HP.
-
Can I prevent FCPX from exploding data files?
Dear all,
I am using FCPX in 6-cam multicam mode on my MacBook Air.
The files from each cam are approx. 1.25 GB each, so it should not be a problem to have 6 of these on the MacBook, as I have about 100 GB spare.
However, whenever FCPX imports these files, it creates a 9.7 GB file for each one on the hard drive.
I cannot see why FCPX would explode the size of the data by a factor of 8. There is no additional information added or anything.
Clearly, with now 60 GB of files (instead of 7), storage becomes an issue and I get the 'not enough memory' warning...
Does anyone know why FCPX explodes the size of the data files or even more importantly how this can be avoided?
A similar problem: if I copy one of these files to the MacBook, it takes about 20 minutes.
If I import one of these files into FCPX and want to save the project as a movie without any editing, in exactly the same format, frame rate, etc., it takes 3 hours. In essence there are no calculations to be performed; just take the file and make a copy. Why is FCPX spending all this processing time?
All in all, it seems that FCPX is almost Microsoft-like in wasting memory and processing power...
Any support would be appreciated.
mad_boy (not as mad as it sounds)
I do have the data on an external drive. In fact, every recording is very long and is split by the camcorder into 22 files of 1.2 GB each (hence the total video data from the 6 camcorders is approx. 180 GB).
However, I only work with one 1.2 GB section from each camera to reduce the data.
To come back to your suggestion:
The data is indeed on an external 1 TB drive. The problem is that as soon as I import a file, FCPX makes a 9.6 GB file on the local SSD drive. I have not found a setting where I could tell it to create this file on the external drive. If you can point me in the right direction, that would be a great help! -
Hi all,
I went ahead with the installation of the French language on ECC6 NetWeaver 04s. The system was installed as ECC6 SR2. The DVD that I used was "Language NetWeaver 2004s SR2 FR". We see the tcode descriptions and user menu in English, and they are missing in French.
Is this because we require a "Language ECC 600 / SR2" DVD instead of the "NetWeaver 04s SR2" DVD during language import?
I have been able to locate a "Language ECC 600 / SR3" DVD on the marketplace. Could I proceed with the ECC language import immediately after "Language NetWeaver 2004s SR2 FR"?
Please share your thoughts!
Thank You,
Antarpreet
Hi All,
Just to add: we have a non-Unicode installation!
Thank You,
Antarpreet -
Generate .DAT file for CG33
Hi Experts,
Can you please explain in detail how we can generate a .DAT file to use for importing specifications through CG33?
Thanks in Advance.
Regards
Kapil
Hello Kapil,
You can create a .DAT file by exporting a specification from transaction CG02; see
http://help.sap.com/erp2005_ehp_04/helpdata/en/a7/287c910a6c11d28a220000e829fbbd/content.htm?frameset=/en/a1/4eeaa2eaab5f4ebd8272e143831f06/frameset.htm¤t_toc=/en/c1/eda0f591ec12408b25e7a1b369ca45/plain.htm&node_id=390
Please read this post as well: http://scn.sap.com/thread/1654090
There used to be a tool called DATA Editor from TechniData/SAP that created a DAT file based on an Excel file.
The tool itself is no longer available, but it is offered as a Migration Service which you can order from SAP via your SAP Account Manager.
Regards
Mark -
Power of File Naming during import
Can you append the HH (Hour) MM (Minute) SS (Second) that the photo was shot to the name while importing?
I have read that Aperture will not allow files to be renamed once they are in the Library, to keep things non-destructive. However, I do not see renaming files or appending metadata in IPTC as destructive, and it should not be ignored if the photographer wants it to be part of his/her workflow.
I am sure that some third party programmers will write Automator scripts to accomplish many of the things that Aperture will be lacking in.
I am curious about the power and functionality of naming digital files as they are imported. I know that they can be tagged with YYMMDD. I have read the Getting Started guide, but want to know if HHMMSS can be appended also. I have searched the Internet for a naming convention that is very flexible and unique. When files are imported, you cannot say what is in each file, so better renaming and IPTC editing should be supported post-import. I do not consider pre-import, or during import, the best place to do this in post-production. I think this is best accomplished afterwards, using Aperture's great meta-tagging tools.
I have accepted, for the majority of my workflow, that files should be named only once, and only tagged with a different name on export (like appending some keywords).
I used to use a random, sequentially generated serial number, but now think that the date is nearly unique (except for multiple shots in a second, or shots from other cameras at the same time). I think this serial number tagged on the photo is more helpful.
Does any body recommend a powerful naming convention for digital image files?
Here is the naming Scheme that I am thinking of.
YYMMDD-HHMMSSx.CR2
X - is 0 through 9 to account for multiple shots a second.
I was planning on using sequential numbering. This will allow me to easily point out differences between multiple files in a stack by speaking the last number, which will usually be unique.
I have also thought about YYMMDD-ABC-HHMMSSx.cr2 (ABC would identify the roll), which is similar to what I have used, but I do not like ABC now because it might mislead me or not mean anything at all.
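The proposed scheme can be sketched as follows (the shot time is hard-coded here; in a real workflow it would come from the photo's EXIF data, and `date -d` is GNU-specific):

```shell
# Build a YYMMDD-HHMMSSx name; x is a 0-9 sequence digit to separate
# multiple frames shot within the same second.
shot_time="2005-11-12 14:03:27"   # would come from EXIF in a real workflow
x=0
name="$(date -d "$shot_time" +%y%m%d-%H%M%S)${x}.CR2"
echo "$name"   # 051112-1403270.CR2
```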
What is everybody using, and what is working for you???
(and again)
Can you append the HH (Hour) MM (Minute) SS (Second) that the photo was shot to the name while importing?
PowerMac G5, Mac OS X (10.4.3)
Yes, after playing with it at home, I confirm the same things. All of the - (dashes) and . (dots) and 'PM' are wasteful, and Aperture will not even show that full name.
Naming is a Huge deal. The photos will not be in Aperture forever, and when they come back out during an export or whatever, they should be how you want them. Not having control makes using aperture as a Library for your photos and projects weak.
If Aperture starts at the import from digital media, then naming happens only after that point. If I have to rename files before importing them into Aperture, then I have to skip Aperture's first steps.
When I name photos, I have in the past tagged some keywords onto the file name. Stacks and smart albums would do a great deal to help, but after I put all of the work into naming and organizing, the file names cannot be set to what I want them to be.
This means that I would still have to use a 3rd-party app to organize, sort, and name all photos, then import into Aperture.
Now the stacks functionality and keyword tagging can be skipped in Aperture. What is left are versions, working with RAW, and smart albums. And Spotlight would do just as good a job searching for photos this way.
I do not want to knock Aperture, but I have to accept that it is a learning curve. I need to learn what it can do for me to save time, and I need to learn what it will not do so that I can stay away from it. I was really looking forward to it being my post-production software.
I would like something like this:
YYMMDD-HHMMSSx-GroupingKeyword(IPTC)-1of7(in stack)
This would use the keywords and IPTC that you work hard to set in Aperture, and also the stacking. Then when photos are exported to clients, or for backup, the web, or use after Aperture has reached EOL, it can be very useful.
I like that iPhoto has 3rd-party extensions that use keywords to save into a photo, or extract them from a photo to add to the iPhoto Library. I am a little more afraid of Aperture because it is a closed system. If a 3rd-party app sets keywords on a photo or renames it, what info would be lost in Aperture?
Thanks for all of the thoughts. -
Remove Windows 8 users but keep data files
I am preparing an upgrade from Windows 8 to 8.1.
All user profiles are defaulted to another drive (the D: drive), not the system drive,
so I have to remove all the users in order to upgrade without needing a clean reinstallation.
This requirement is very troublesome for me (and other system admins in the world):
I have to be very careful when dealing with user data files.
In the official way, the user is deleted and all his files are deleted too.
Is it possible to keep the user data files? The file size may be over 2xx GB.
I do not wish to copy the files before removing the user account; it wastes a lot of time.
I tried renaming the user profile folder and removing the user's entry under the ProfileList registry key.
It does not seem to work. Any suggestions, please? Thanks.
Hi,
As far as I know, if you have changed the User folder location to another drive, you may encounter problems when upgrading Windows 8 to Windows 8.1; the upgrade will probably fail.
Actually, I still suggest you make a backup of your user profile and then upgrade your system. Or, if the Windows upgrade is unnecessary, I would suggest you keep using Windows 8 to avoid potential problems.
Roger Lu
TechNet Community Support -
I have the latest downloadable version of LR5. It crashed while it was creating 1:1 previews during an import. After a Win7 (64) restart, it shows a message that LR needs to quit because it can't read the preview files, and that it will try to fix them the next time it launches. I get the same message the next and every subsequent time it launches, so I can't launch LR at all now.
I get that the preview file got corrupted somehow. Is there some way to fix this problem without building a new catalog?
Use Windows Explorer to open the folder containing your catalog. You will see a folder with the extension .lrdata. You need to delete that folder and then start Lightroom again. Lightroom will generate a new previews folder.
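That fix can be sketched as follows (a scratch folder stands in for the real catalog folder, and the previews package name is an assumption):

```shell
# Delete the corrupted previews package next to the catalog; Lightroom
# rebuilds it on the next launch.
CATDIR=$(mktemp -d)    # stands in for the folder containing the .lrcat file
mkdir "$CATDIR/Lightroom 5 Catalog Previews.lrdata"   # assumed package name
rm -rf "$CATDIR"/*.lrdata
ls -A "$CATDIR"        # empty: the previews package is gone
```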
-
Error while trying to import a repository data file in the B2Bconsole
I am trying to import a repository data file with the delivery-channel element and attributes as below:
<delivery-channel name="GM_DC" syncReplyMode="none" nonrepudiation-of-origin="false" nonrepudiation-of-receipt="false" secure-transport="true" confidentiality="false" routing-proxy="false" document-exchange-name="GM_DE" transport-name="GM"/>
I have modified the WLC.dtd present in the lib\dtd subdirectory of the WebLogic Integration installation directory to add the syncReplyMode attribute. When I import the XML file, I get the following error:
com.bea.b2b.B2BException: ERROR: Invalid document, file 'C:\DOCUME~1\ADMINI~1\LOCALS~1\Temp\B2B_COV_Export.xml'.
at com.bea.b2b.bulkloader.BulkLoader.processDataFile(Unknown Source)
at com.bea.b2b.bulkloader.BulkLoader.processDataFile(Unknown Source)
at com.bea.b2b.bulkloader.BulkLoader.load(Unknown Source)
Please advise,
Thanks in advance
Hello Esther,
the problem seems to be that the temp folder of the target Integration Builder system can't be found:
'The system cannot find the path specified
at [..] FileAccess.getTempDirectory([..])'.
You'll experience the same problem if you try a file-based import or export within the Integration Builder directly.
I would recommend continuing the search there. You could check whether the environment variables (for Windows: TEMP and TMP) of the OS of the system with the target Integration Builder point to an existing path. Check also whether the WebAS can access this path.
Good luck
Frank