Final Cut Pro XML files are created in SmartScoreX?
I am trying to get a correct import of a printable and playable .xml musical score into Logic Pro 9. I am scanning from PDF to XML in Musitek's SmartScoreX, which saves the file as a Final Cut Pro XML, as seen here: http://img.skitch.com/20090919-824d2t73nncnegk9b7pw73jpxs.jpg
There are more export options; maybe I am using the wrong type. These are the options:
http://img.skitch.com/20090919-riwsn4g9x4h1umpmcmwcark9e7.jpg
If anyone has knowledge of moving scores, including graphics and sound [= XML?], from Final Cut Pro to Logic Pro 9, please help me figure this out. Which is the right file type from the list of options above?
Thanks for helping.
You should ask this in the Logic forum as that is the target.
http://discussions.apple.com/forum.jspa?forumID=1201
That said, according to the Logic Manual found online:
*To import Final Cut Pro XML files*
Do one of the following:
1) Choose File > Import, then choose the file in the Import dialog.
2) Locate and select the file in the Browser, then click the Open button.
The XML import procedure allows you to change or retain the sample rate of audio files used in your Final Cut Pro sequences. If you import sequences that use audio files with different sample rates, you are given the following options:
Alter the sample rate of your Logic Pro project to match all imported Final Cut Pro sequence audio files.
Retain the sample rate of your Logic Pro project. All Final Cut Pro sequence audio files that use a different sample rate are converted.
Note: A Final Cut Pro sequence is an arrangement of video, audio, and graphics clips, edit information, and effects. When combined, they create a movie. Use of XML to import Final Cut Pro sequences into Logic Pro allows you to exchange multiple audio tracks, with all positional region information, region names, and volume and pan automation data, retained.
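As a rough illustration of the sample-rate check described above, here is a short Python sketch. The XML structure and element names below are invented for illustration only; real Final Cut Pro XML uses a different, much richer schema.

```python
import xml.etree.ElementTree as ET

# Simplified stand-in for a Final Cut Pro XML sequence; the real
# interchange format nests clips and rates far more deeply.
doc = """
<sequence>
  <rate>48000</rate>
  <clip name="strings.aif"><samplerate>48000</samplerate></clip>
  <clip name="piano.aif"><samplerate>44100</samplerate></clip>
</sequence>
"""

root = ET.fromstring(doc)
project_rate = int(root.findtext("rate"))

# Any clip whose rate differs would either force a project-rate
# change or be converted on import, per the Logic manual.
mismatched = [
    clip.get("name")
    for clip in root.findall("clip")
    if int(clip.findtext("samplerate")) != project_rate
]
print(mismatched)  # ['piano.aif']
```

This is only meant to show what "sequences that use audio files with different sample rates" means in practice: Logic has to resolve exactly this kind of mismatch at import time.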
Similar Messages
-
Full sync reconciliation - no xml file is created
Hi,
I am using Oracle VM Templates for PeopleSoft and OIM
I followed exactly the guide here http://download.oracle.com/docs/cd/E11223_01/doc.910/e11205/deploy.htm#BIHEBCCI
But my full data publish is not working. No XML file is created on the file node.
Under Enterprise Components >Integration Definitions >Initiate Processes >Full Data Publish I can start a request for PERSON_BASIC_FULLSYNC service operation.
The Run Status is success but distribution status hangs on posting for any of the async-one way messages.
Anyone had issues with this before?
There is a related post here Re: PEOPLE_BASIC_FULLSYNC file export to a file node not working
Under Monitoring>Asynchronous services all of the messages are set to status new.
However, synchronous messages/service operations are processing successfully via PeopleCode.
Please help
Thanks.
Issue solved, see the related post.
You have to purge the domain settings in PIA.
PIA -> PeopleTools -> Integration Broker -> Service Operations Monitor -> Administration -> Domain Status
-
XI to read the action xml file and create a SAP notification in PM
Hi All
I am new to XI world can you please help me in doing this scenario.
I have to read an XML file and create a Notification in PM module of SAP.
Step by Step help would be great.
Thanks in Advance
Sai
Hi Sai,
To send data from an XML file to SAP (any module) there are two ways:
1. File to IDoc, and
2. File to RFC.
First identify the relevant BAPI or IDoc for creating the notification, then build the scenarios.
For a step-by-step process, go through this link:
New to XI
regards,
Ansar.
-
Why multiple log files are created while using transaction in berkeley db
We are using the Berkeley DB Java Base API. We have already read/written a CDR file of 9 lakh (900,000) rows, both with and without transactions, implementing the secondary database concept. The issues we are seeing are as follows:
With transactions: the size of the database environment is 1.63 GB, which is due to the number of log files created, each of 10 MB.
Without transactions: the size of the database environment is 588 MB, and only one log file, of 10 MB, is created. So we want to know, with a concrete reason and conclusion:
how log files are created, what using or not using transactions means in a DB environment, and what these db files (__db.001 through __db.005) and log files (log.0000000001, ...) are. Please reply soon.
we are using berkeleydb java edition db base api
If you are seeing __db.NNN files in your environment root directory, these are the environment's shared region files. And since you see these, you are using Berkeley DB Core (with the Java/JNI Base API), not Berkeley DB Java Edition.
with transaction ...
without transaction ...
First of all, do you need transactions or not? Review the documentation section called "Why transactions?" in the Berkeley DB Programmer's Reference Guide.
without transaction-------size of database environment 588mb and here only one log file is created which is of 10mb.
There should be no logs created when transactions are not used. That single log file has likely remained there from a previous transactional run.
how log files are created and what is meant of using transaction and not using transaction in db environment and what are this db files db.001,db.002,_db.003,_db.004,__db.005 and log files like log.0000000001
Have you reviewed the basic documentation references for Berkeley DB Core?
- Berkeley DB Programmer's Reference Guide
in particular sections: The Berkeley DB products, Shared memory regions, Chapter 11. Berkeley DB Transactional Data Store Applications, Chapter 17. The Logging Subsystem.
- Getting Started with Berkeley DB (Java API Guide) and Getting Started with Berkeley DB Transaction Processing (Java API Guide).
If so, you would have had the answers to these questions; the __db.NNN files are the environment shared region files needed by the environment's subsystems (transaction, locking, logging, memory pool buffer, mutexes), and the log.MMMMMMMMMM are the log files needed for recoverability and created when running with transactions.
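As a small aid to telling these files apart, here is a Python sketch that classifies directory entries by the naming patterns described above. It works on file names only and makes no Berkeley DB calls:

```python
import re

# Classify Berkeley DB environment files by name alone:
#   __db.NNN        -> shared region files (subsystem state)
#   log.MMMMMMMMMM  -> transactional write-ahead log files
REGION = re.compile(r"^__db\.\d{3}$")
LOGFILE = re.compile(r"^log\.\d{10}$")

def classify(name):
    if REGION.match(name):
        return "region"
    if LOGFILE.match(name):
        return "log"
    return "other"

files = ["__db.001", "__db.002", "log.0000000001", "mydb.db"]
print([classify(f) for f in files])
# → ['region', 'region', 'log', 'other']
```

A directory full of `log.*` files growing toward gigabytes, as in the question, is the signature of a transactional environment whose logs have not yet been archived or removed.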
--Andrei -
Recently loads of .tmp files are created and left on exit in the TEMP folder. CC Cleaner cleans them, but it takes time because there are so many. Why?
== found when running CC Cleaner and it took a long time -
The attached VI works when I run it initially, creating the XML file, but when I change data and run again after the XML file is created, it is not updated with the new data. What am I doing wrong?
Thank you.
Solved!
Go to Solution.
Attachments:
Attractive Force XML.vi 23 KB
That looks like the logic which was written into the VI:
Inner true/ false case:
(file exists : false) write to XML file, unflatten for display
(file exists : true) read from XML file, unflatten for display
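A Python sketch of this write-once pattern (the file name is hypothetical), followed by the fix of writing whenever new data is supplied:

```python
import os
import tempfile

# Hypothetical stand-in for the VI's XML file path.
path = os.path.join(tempfile.gettempdir(), "attractive_force.xml")
if os.path.exists(path):
    os.remove(path)  # start clean for this demonstration

def save_or_load_buggy(data):
    # Mirrors the VI: only writes when the file does not exist yet,
    # so a second run with new data silently returns the old file.
    if not os.path.exists(path):
        with open(path, "w") as f:
            f.write(data)
    with open(path) as f:
        return f.read()

def save_then_load_fixed(data):
    # Fix: write the current data unconditionally, then read back.
    with open(path, "w") as f:
        f.write(data)
    with open(path) as f:
        return f.read()

save_or_load_buggy("<x>1</x>")                # first run: file is written
print(save_or_load_buggy("<x>2</x>"))         # still "<x>1</x>" (stale)
print(save_then_load_fixed("<x>2</x>"))       # "<x>2</x>" (updated)
```

The same restructuring applies to the VI: the "write to XML file" path has to run whenever fresh data arrives, not only when the file is missing.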
At no time when the file exists (true case) is the XML file updated... -
Why are some extra XML tags created when an XML file is viewed in Adobe Illustrator with the Esko plugin?
We see some extra XML tags created when viewing an XML file in Adobe Illustrator with the Esko plugin. Why are these extra lines created?
Screenshot of the XML and the XML viewed in AI below (the extra XML tags outlined in red):
Hi Graffiti,
Thanks so much for your reply. I am happy to say that the problem has been fixed by my customer updating her version of Adobe Reader 9 to the latest version. I had sent her a few suggestions on how to try to fix the problem, including that one, from info I had read on the forum, and that was the solution that worked. I am so relieved that I was able to fix the problem. To answer your question, the blank pages were only graphics, no text, but they were the only pages where the graphics took up the whole page. I thought that was strange and had her change a preference setting for printing large images, a solution I read in the forum, but it did not work.
Compressor is not working. When I export to Compressor from FCP to encode a DVD, FCP freezes and Compressor appears to be working, but the progress bar never fills and no files are created. I'm using a MacBook Pro, OS X 10.6.7, 2.66 GHz Core 2 Duo, 4 GB 1067 MHz RAM. I'm not new at this. I re-installed FCP successfully. I trashed the preferences. Please help.
Do you have any filters applied to your clips?
This happened to me before; I had accidentally applied a Broadcast Safe filter to a color matte that was black.
I figured it out by duplicating my sequence and deleting items from my timeline 1 by 1 until I was able to export a reference movie.
Anywho, export again, this time with "self contained" movie selected.
Then try to import to Compressor. -
SHADOW_IMPORT_UPG1 is very very slow, no log files are created
Hi all
We are now doing our production upgrade. During the SHADOW_IMPORT_UPG1 phase the system is very slow, and
no log files are created in the /usr/sap/put/log directory.
Only three files are growing in the /usr/sap/tmp directory:
orar3p> ls -lrt
total 219176
-rw-rw-rw- 1 r3padm sapsys 2693 Aug 15 18:42 UCMIG_DE.ECO
-rw-rw-rw- 1 r3padm sapsys 2374 Aug 15 18:42 R3trans.out
-rw-rw-rw- 1 r3padm sapsys 2685 Aug 15 18:46 ADDON_TR.ECO
-rw-rw-rw- 1 r3padm sapsys 726 Aug 15 20:04 crshdusr.log
-rw-rw-rw- 1 r3padm sapsys 3915 Aug 15 21:53 EU_IMTSK.ECO
-rw-rw-r-- 1 r3padm sapsys 257 Aug 15 22:09 SAPKKLFRN18.R3P
-rw-rw-r-- 1 r3padm sapsys 257 Aug 15 22:09 SAPKKLPTN18.R3P
-rw-rw-r-- 1 r3padm sapsys 257 Aug 15 22:09 SAPKKLESN18.R3P
-rw-rw-r-- 1 r3padm sapsys 36433272 Aug 15 23:44 SAPKLESN18.R3P
-rw-rw-r-- 1 r3padm sapsys 36807577 Aug 15 23:44 SAPKLFRN18.R3P
-rw-rw-r-- 1 r3padm sapsys 35372350 Aug 15 23:44 SAPKLPTN18.R3P
orar3p> date
Fri Aug 15 23:44:54 PDT 2008
Can anyone advise what to do?
Thanks
Senthil
Hello,
did you discover what the cause was for this phase running so slow? And how long did it take to complete in the end?
We are currently running an upgrade of our Development system and have struck the same issue.
I killed the upgrade after the phase had been running for 4 hours and restarted it, but it looks like it is still going to run for a long time.
Regards, John
-
Webdav and xdb, xml-files are automatically deleted (or hidden)
Hi All.
In our project we have mounted webdav from a Linux box against an IBM database server running AIX and Oracle 11g.
The file system on the Linux box is mounted to the xdb-schema on the dbserver.
When placing XML files into the database through the WebDAV catalogue on the Linux box, the files are copied over, but almost immediately removed (or hidden) from the target directory in the database.
I'm thinking there might be a trigger that tries to validate the XML file against an XSD that isn't registered in XDB. The reason for this is that when these files are suffixed with something other than .xml, they are kept visible to all users.
What I'd like to know is how to disable this check/trigger, and which trigger does this.
Can anybody tell me if my assumption is correct, and if yes, how to disable this checking?
Our users in the project will also be given access to folders through this WebDAV mount, and they will use it as storage space for other XML files as well, files we do not have XSDs for. Another function of this WebDAV directory is to serve as a source directory for ODI, and ODI validates the files against the XSD when transferring data to the database, so we don't need this extra validation in the database.
Thanks,
Bob
Hi,
What's the database version? (select * from v$version)
When placing xml-files into the database through the webdav-catalogue on the Linux-box the files are copied over, but almost immediatly removed (or hidden)
Does that mean you see them for a short period of time and then they disappear, or that you never see them at all?
Are you using XML DB events?
You can check if there's any resource configuration defined, and that may explain this behaviour :
select x.*
from xdb.xdb$resconfig rc
, xmltable(
xmlnamespaces(default 'http://xmlns.oracle.com/xdb/XDBResConfig.xsd')
, '/ResConfig/event-listeners/listener'
passing rc.object_value
columns description varchar2(300) path 'description'
, schema varchar2(30) path 'schema'
, source varchar2(30) path 'source'
, events varchar2(300) path '(#ora:xq_proc#){string-join(for $i in events/child::* return name($i), ", ")}'
, condition varchar2(300) path 'pre-condition/existsNode/XPath'
) x
DESCRIPTION SCHEMA SOURCE EVENTS CONDITION
Register event handlers for users. SYS DBMS_XS_PRINCIPAL_EVENTS_INT Pre-Create, Post-Create, Pre-Update, Pre-Delete /r:Resource[r:SchemaElement="http://xmlns.oracle.com/xs/principal.xsd#user"]
Register event handlers for role sets. SYS DBMS_XS_ROLESET_EVENTS_INT Pre-Create, Post-Create, Pre-Update, Pre-Delete /r:Resource[r:SchemaElement="http://xmlns.oracle.com/xs/roleset.xsd#roleSet"]
Register event handlers for roles. SYS DBMS_XS_PRINCIPAL_EVENTS_INT Pre-Create, Post-Create, Pre-Update, Pre-Delete /r:Resource[r:SchemaElement="http://xmlns.oracle.com/xs/principal.xsd#dynamicRol
Register event handlers for dynamic roles. SYS DBMS_XS_PRINCIPAL_EVENTS_INT Pre-Create, Post-Create, Pre-Update, Pre-Delete /r:Resource[r:SchemaElement="http://xmlns.oracle.com/xs/principal.xsd#role"]
Register event handlers for function roles. SYS DBMS_XS_PRINCIPAL_EVENTS_INT Pre-Create, Post-Create, Pre-Update, Pre-Delete /r:Resource[r:SchemaElement="http://xmlns.oracle.com/xs/principal.xsd#functionRo
Register event handlers for Data Security. SYS DBMS_XS_DATA_SECURITY_EVENTS Post-Update, Post-Delete /r:Resource[r:SchemaElement="http://xmlns.oracle.com/xs/dataSecurity.xsd#DataSec
Register event handlers for Security Classes. SYS DBMS_XS_SECCLASS_EVENTS Pre-Update, Pre-Delete /r:Resource[r:SchemaElement="http://xmlns.oracle.com/xs/securityclass.xsd#securi
SYS DBMS_NETWORK_ACL_ADMIN Pre-Delete
PL/SQL Network ACL Resource Configuration
Handling of Office Open XML spreadsheets OOX OOX_SML_STORE Pre-Create, Pre-Delete /Resource[ContentType="application/vnd.openxmlformats-officedocument.spreadsheet
9 rows selected
And if your target folder has any config file associated, for example :
SQL> select *
2 from table(dbms_resconfig.getResConfigPaths('/office/excel/docs'));
COLUMN_VALUE
/office/excel/conf/sml_rescfg.xml
-
OK, so each character rolled with my app gets saved as an HTML file in a folder I create on the SD card called RpgApp.
The code to do this is:
private async Task saveToHtmlCharacter(string name)
{
    StorageFolder externalDevices = Windows.Storage.KnownFolders.RemovableDevices;
    StorageFolder sdcard = (await externalDevices.GetFoldersAsync()).FirstOrDefault(); //Windows.Storage.ApplicationData.Current.LocalFolder;
    var dataFolder = await sdcard.CreateFolderAsync("RpgApp",
        CreationCollisionOption.OpenIfExists);
    var file = await dataFolder.CreateFileAsync(name + ".html",
        CreationCollisionOption.ReplaceExisting);
    using (StreamWriter writer = new StreamWriter(await file.OpenStreamForWriteAsync()))
    {
        writer.WriteLine("<html><head><title>Character File for " + z.charNameFirst + " " + z.charNameSecond + "</title></head><body>");
        //trimmed for the post
    }
}
Now this, as I say, works perfectly, and I can confirm that the files show up in the "RpgApp" folder.
The next step is to have the app read the names of each of those HTML files and create a separate button for each one, named for the filename and with the filename as content (later I will link them up to load the HTML file they are named for in a web browser panel).
Here is the code that *should* do that:
private async Task readFiles()
{
    z.test.Clear();
    StorageFolder externalDevices = Windows.Storage.KnownFolders.RemovableDevices;
    StorageFolder folder = (await externalDevices.GetFolderAsync("RpgApp"));
    IReadOnlyList<StorageFile> fileList = await folder.GetFilesAsync();
    //z.test.Add("dummy");
    foreach (StorageFile file in fileList)
        z.test.Add(file.Name);
}

public async void buttonTest()
{
    await readFiles();
    CoreDispatcher dispatcher = CoreWindow.GetForCurrentThread().Dispatcher;
    await dispatcher.RunAsync(CoreDispatcherPriority.Normal, () =>
    {
        foreach (string name in z.test)
        {
            Button button1 = new Button();
            button1.Height = 75;
            button1.Content = name;
            button1.Name = name;
            testStackPanal.Children.Add(button1);
        }
    });
}
Now I say *should* because what I get is an error: "A first chance exception of type 'System.UnauthorizedAccessException' occurred in mscorlib.ni.dll" (two of them in fact, one after another).
A more detailed error is at http://pastebin.com/3hBaZLQ9
I went through checking line after line to find the error, and the line that causes it is:
StorageFolder folder = (await externalDevices.GetFolderAsync("RpgApp"));
in the readFiles method
I checked to make sure the case is right etc. and it's spot on; that IS the folder name. And remember, my app creates that folder and creates the files inside it (and only files my app creates are in that folder), so it should have access.
I'm 100% stuck. It's taken me all day to get the files to save and now I can't get them to list correctly.
I have tested the button creation function with fake list data and that worked fine; as I say, the error occurs when I tell it to look at the folder the app created.
All help most gratefully received.
This was actually solved for me on another forum thread:
async Task<bool> continueToReadAllFiles(StorageFolder folder)
{
    if (folder == null)
        return false;
    if (folder.Name.Equals("RpgApp", StringComparison.CurrentCultureIgnoreCase))
    {
        var files = await folder.GetFilesAsync();
        foreach (var file in files)
            test.Add(file.Name);
        return false;
    }
    var folders = await folder.GetFoldersAsync();
    foreach (var child in folders)
    {
        if (!await continueToReadAllFiles(child))
            return false;
    }
    return true;
}

public async void buttonTest()
{
    test.Clear();
    StorageFolder externalDevices = Windows.Storage.KnownFolders.RemovableDevices;
    var folders = await externalDevices.GetFoldersAsync();
    foreach (var folder in folders)
    {
        if (!await continueToReadAllFiles(folder))
            break;
    }
    //.... commented out
}
Now, I say "solved", but this is more a workaround... it works, though.
I would love to know why we can't just scan the RpgApp folder directly, instead of having to scan the whole SD card and disregard everything that's not in the RpgApp folder.
How to Parse the XML File and create an IDOC?
Hello friends,
I have an XML file which needs to be parsed to create an IDoc in SAP to post the New Hire process. I need to write an ABAP program for this.
Could somebody help me do this?
Thanks
Here is sample code for loading a local XML file and parsing it using the above-mentioned FM:
report y_xml_upload
no standard page heading.
data: filename type string ,
xmldata type xstring .
data: result_xml type standard table of smum_xmltb .
data: return type standard table of bapiret2 .
constants: line_size type i value 255.
data: begin of xml_tab occurs 0,
raw(line_size) type x,
end of xml_tab,
file type string,
size type i.
* upload the xml file
filename = 'C:\raja123.xml'.
call function 'GUI_UPLOAD'
exporting
filename = filename
filetype = 'BIN'
has_field_separator = ' '
header_length = 0
importing
filelength = size
tables
data_tab = xml_tab
exceptions
others = 1.
************uncomment this and comment the call of SCMS_BINARY_TO_XSTRING if you dont have this fm in your system.
* if sy-subrc <> 0.
* clear: xmldata.
* exit.
* else.
* data: len type i.
* len = size.
* loop at xml_tab.
* if len <= line_size. exit. endif.
* concatenate xmldata xml_tab-raw(line_size)
* into xmldata in byte mode.
* len = len - line_size.
* endloop.
* if len > 0.
* concatenate xmldata xml_tab-raw(len)
* into xmldata in byte mode.
* len = len - size.
* endif.
* endif.
******* end of comment.
call function 'SCMS_BINARY_TO_XSTRING'
exporting
input_length = size
* FIRST_LINE = 0
* LAST_LINE = 0
importing
buffer = xmldata
tables
binary_tab = xml_tab
exceptions
failed = 1
others = 2.
if sy-subrc <> 0.
* MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
* WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
endif.
call function 'SMUM_XML_PARSE'
exporting
xml_input = xmldata
tables
xml_table = result_xml
return = return .
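As a rough, language-neutral illustration of what a flat parse table looks like, here is a Python sketch that walks an XML document into (level, tag, value) rows. This approximates the spirit of the table SMUM_XML_PARSE returns, not its exact columns:

```python
import xml.etree.ElementTree as ET

# Hypothetical miniature of a new-hire XML payload.
doc = "<hire><emp><name>Raja</name><dept>PM</dept></emp></hire>"

def flatten(elem, level=1, rows=None):
    # Depth-first walk producing one row per element, similar in
    # spirit to the hierarchy/tag/value rows SMUM_XML_PARSE fills.
    if rows is None:
        rows = []
    rows.append((level, elem.tag, (elem.text or "").strip()))
    for child in elem:
        flatten(child, level + 1, rows)
    return rows

rows = flatten(ET.fromstring(doc))
print(rows)
# [(1, 'hire', ''), (2, 'emp', ''), (3, 'name', 'Raja'), (3, 'dept', 'PM')]
```

From such a flat table you would then map the values into the IDoc segments (or BAPI parameters) needed to create the PM notification.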
Regards
Raja
do not forget to assign points to helpful answers -
How to change the replication group information after db files are created
Since group information is persisted in the database, I am wondering if there is a way to update the information.
We want to implement some kind of Berkeley DB master relay mechanism for our two data centers, which have a slow link in between. Basically, have one master populate a database file, then launch two other nodes as masters to replay it to the other nodes of their own groups. It is much more efficient this way, so we don't have to copy the data multiple times over the slow link.
We periodically (once a day) update the Berkeley DB content from a customer feed on a backend node and upload (rsync) the Berkeley DB file to the two data centers. We would like to have a master node in each data center read the pre-populated data file and replicate the changes to the web nodes (read-only) while they are still running. I simulated this locally, and if I trick the nodeName and nodeHostPort settings it should work (basically, faking the replication nodes on the backend node using a tampered hosts file so they get registered). However, it is not very convenient and definitely a dangerous hack on the production servers.
If there were a way, after creation, to update the group information (for example, change all the node information) without corrupting the log file/replication stream, it would be much easier for us. Basically, we would like to have the node/group information and the data file de-coupled.
Any ideas how to do that, or is there a better way to design such a replay of data using Berkeley DB?
Thanks in advance!2. You mentioned to not clean up the log file. Is there a point where I can safely call clean up on the environment when BDB is still online as I can imagine we will run out of space very soon if we don't clean up.The approach outlined above (steps 1 to 5) will ensure that no log files are deleted on A while you are updating B and C. The use of DbBackup ensures this. For more information on how this works, see the DbBackup javadoc.
Whether this causes you to run out of disk space on A is something you'll have to evaluate for yourself. It depends on the write rate on A and how long it takes to do the copy to B and C. If this is a problem, you could make a quick local copy of the environment on A, and then transfer that copy to B/C. But you must prohibit log file deletion during the copy, using DbBackup, or the copy will be invalid.
You should perform explicit JE log cleaning (including a checkpoint) before doing the copy to B/C. This will reduce the number of files that are copied to B/C, and will reduce the likelihood that you'll fill the disk on A. See the javadoc for Environment.cleanLog for details on how to do an explicit log cleaning.
In your earlier post, it sounded like the updates to A were in batch mode -- done all at once at a specific time of day. If so, you can do the copy to B/C after the update to A. In that case, I don't understand why you are afraid of filling the disk on A, since updates would not be occurring during the copy to B/C.
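The ordering of those steps can be sketched schematically. The Python below uses stand-in functions, not the real JE API; it only shows the sequence in which the calls must happen:

```python
# Schematic only: these are stand-ins, not the Berkeley DB JE API.
calls = []

def clean_log():    calls.append("cleanLog")     # Environment.cleanLog + checkpoint
def start_backup(): calls.append("startBackup")  # DbBackup.startBackup: pins log files
def copy_files():   calls.append("copy")         # rsync/copy env files to B and C
def end_backup():   calls.append("endBackup")    # DbBackup.endBackup: deletion may resume

def refresh_replicas():
    clean_log()       # fewer files to ship, less disk pressure on A
    start_backup()    # from here on, log files must not be deleted
    try:
        copy_files()
    finally:
        end_backup()  # always release, even if the copy fails

refresh_replicas()
print(calls)  # ['cleanLog', 'startBackup', 'copy', 'endBackup']
```

The try/finally mirrors the key constraint from the answer: the backup window must always be closed, and the copy is only valid while it is open.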
--mark -
XML files are empty when downloaded on my mac
I don't understand why the SEPA ST banking XML files appear empty on my MacBook Pro whilst being perfectly normal on our Mac or Windows laptop. Anyone have an idea how to solve this?
Try a safe boot.
Shut down your machine. Hold down the Shift key. Power on. Wait a while while your hard drive
is being checked.
http://support.apple.com/kb/ht1455 -
XML files are being read as PBEM game file format!!
Hello,
I am trying to edit some .xml files in Script Editor, but every time I try to open them I am told:
'Script Editor cannot open files in the "PBEM game file" format'
(Double clicking the .xml file opens up the Big Bang game)
Does anyone know how to get around this? Also, is Script Editor the best application to use? I want to use an editor that has colours.
Many thanks, rob
Script Editor has one purpose: editing and compiling AppleScript.
I can't account for the wording of the error message, but for sure I wouldn't expect it to handle XML files directly.
You could use AppleScript to manipulate the XML data, but that's probably not ideal either.
What you need is a text editor. BBEdit would be my first choice, or its cheaper (read: free) sibling TextWrangler.
Both can handle any type of text file, including syntax highlighting for many structured files including programming source code.