Multiple files as book vs. single large file

I am beginning two projects with a coworker. Both are multi-page publications (a large newsletter and an annual report). We've run into the problem before of both needing to be in the same file at the same time, going back and forth opening and closing the file over and over, which quickly becomes a huge hassle. We don't have InCopy or any other software that would allow us both to work on the same file.
So my question is: should I create a file for each spread in the publications and compile them in a master InDesign file as a "book," or just deal with it and create two large files?

I do a magazine every two months, and every article is a separate file, compiled in a book. I design, and someone else edits.
So I'm all for the book feature; it really works.

Similar Messages

  • Exporting multiple internal tables to a single Excel file

    Hello Expert,
      I want to export more than one internal table from a Web Dynpro application to a single Excel file, in such a way that
    each table's data is exported to a different tab (multiple sheets) in that single Excel file.
    Please help me with this.
    Thank You.
    Varun
    Moderator message: wrong forum, please post again in "Web Dynpro ABAP".
    Edited by: Thomas Zloch on Oct 29, 2010 1:39 PM

    Each table would have a different sheet in the same CSV file.
    A CSV file is a flat file and doesn't have "sheets"; you would have to export to an Excel file, which supports several sheets in one file.
    Olaf Helper
    [ Blog] [ Xing] [ MVP]
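    For readers outside the ABAP stack, the same "one table per sheet" idea is easy to sketch in Python with pandas; this is an illustration only, with made-up table names and data, not Web Dynpro code:
    import pandas as pd
    # Two hypothetical internal tables to export
    orders = pd.DataFrame({"id": [1, 2], "total": [9.99, 24.50]})
    customers = pd.DataFrame({"id": [1, 2], "name": ["Ana", "Ben"]})
    # One workbook, one sheet per table (needs an Excel engine such as openpyxl installed)
    with pd.ExcelWriter("report.xlsx") as writer:
        orders.to_excel(writer, sheet_name="Orders", index=False)
        customers.to_excel(writer, sheet_name="Customers", index=False)
    Each to_excel call targets its own sheet_name, which is exactly what a flat CSV cannot express.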

  • Writing multiple port data report into single excel file

    I'm working on STTE automation of a power management unit, in which data from 6 different channels is received from the unit under test, then verified and correlated with the digital and analog inputs given by the user. So my question is: how do I write all six channels' data to a single Excel file with multiple worksheets, i.e. channel 1 in sheet1, channel 2 in sheet2, and so on? (The data comes in over 6 different serial communication ports.)
    Please guide me on how to proceed with the final report generation for the above requirement in LabVIEW.

    As nyc mentioned, you will have to use ActiveX if you want to do exactly what you described, but if you're new to LabVIEW that can be a big step.
    In the VI you posted you write your data to an .xls file, but the fact is that you use the Write to Text File function, so in the end your file is just a simple text file, which Excel happens to be able to open.
    Maybe you could have your VI write txt files and then have another bit of code that transfers the data from each text file to a different worksheet in an Excel file.
    Or maybe another option would be to write all your data to the same TDMS file (one channel per port) and then use the TDMS Excel Add-In to generate an Excel file from the TDMS.
    Feel free to ask more questions :-o
    When my feet touch the ground each morning the devil thinks "bloody hell... He's up again!"
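    Following up on the TDMS suggestion above: for anyone prototyping it outside LabVIEW, here is a minimal Python sketch using the npTDMS library. The group/channel names and the placeholder data are invented:
    import numpy as np
    from nptdms import TdmsWriter, ChannelObject
    # One channel per serial port, all grouped under "PMU"
    with TdmsWriter("pmu_channels.tdms") as writer:
        for port in range(1, 7):
            samples = np.zeros(100)  # stand-in for the samples read from COM<port>
            writer.write_segment([ChannelObject("PMU", f"channel_{port}", samples)])
    The resulting file can then be converted with the TDMS Excel Add-In mentioned above.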

  • Importing multiple amount columns from a single text file

    I'm sure this question has been addressed many times. I have tried to search for an answer here and other areas, but I have not been able to find a clear answer yet. I am relatively new to HFM and FDM and thus do not have a lot of experience to fall back on. I am primarily a Planning/Essbase person. That being said, here is my question:
    I have a data source (text file) containing two amount columns that I need to load to HFM via FDM. One amount column consists of Average Exchange Rates and the other amount column consists of Ending Exchange Rates. I have been asked to develop a process to load both columns of data to HFM using a single process (one Import Format). I've been told this is possible by writing an Import DataPump script. It seems that I would need to create a temporary record set based on the original source file and modify it so that it contained a duplicate set of records where the first set would be used for the Average Rate and the second set would be used for the Ending Rate. This would be a piece of cake using SQL against a relational source, but that's obviously not the case here. I do have some experience with writing FDM scripts but from an IF... Then... Else... standpoint based on metadata values.
    If there is anyone out there that has time to help me with this, it would be most appreciated.
    Thanks,

    This is relatively easy to achieve with a single import script associated with the Account source field (assuming AverageRate and EndRate are accounts in your application) in your import format.
    Essentially your first amount, say AverageRate, would be set as the default Amount field, and those values would be loaded as if the file had a single value column. For the second value, EndRate, you would have to insert the value directly into the FDM work table, which is the temporary table populated when data is imported from a file during the import process. The example code snippet below should give you guidance on how this is done:
    'Get name of temp import work table
    strWorkTableName = RES.PstrWorkTable
    'Create temp table trial balance recordset
    Set rsAppend = DW.DataAccess.farsTable(strWorkTableName)
    If IsNumeric(EndRateFieldValue Ref Goes Here) Then
        If EndRateFieldValue Ref Goes Here <> 0 Then
            'Create a new record and supply its field values
            rsAppend.AddNew
            rsAppend.Fields("DataView") = "YTD"
            rsAppend.Fields("PartitionKey") = RES.PlngLocKey
            rsAppend.Fields("CatKey") = RES.PlngCatKey
            rsAppend.Fields("PeriodKey") = RES.PdtePerKey
            rsAppend.Fields("CalcAcctType") = 9
            rsAppend.Fields("Account") = "EndRate"
            rsAppend.Fields("Amount") = EndRateFieldValue Ref Goes Here
            rsAppend.Fields("Entity") = DW.Utilities.fParseString(strRecord, 16, 1, ",")
            rsAppend.Fields("UD1") = DW.Utilities.fParseString(strRecord, 16, 2, ",")
            rsAppend.Fields("UD2") = DW.Utilities.fParseString(strRecord, 16, 3, ",")
            rsAppend.Fields("UD3") = DW.Utilities.fParseString(strRecord, 16, 16, ",")
            'Append the record to the collection
            rsAppend.Update
        End If
    End If
    'Close recordset
    rsAppend.Close
    In addition, the return value of this import script should be "AverageRate", i.e. the name of the account associated with the first value field. The NZP expression also needs to be put on the Amount field in the import format, to ensure that the EndRate field value is always processed even if the value of AverageRate is zero.
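    To visualize the reshaping the script performs, here is a small Python/pandas sketch, with invented column names and values, that turns a two-amount-column record set into the doubled record set described above:
    import pandas as pd
    # Hypothetical source rows with two amount columns each
    src = pd.DataFrame({
        "Entity": ["E100", "E200"],
        "AverageRate": [1.10, 0.85],
        "EndRate": [1.12, 0.83],
    })
    # melt emits one row per (Entity, Account): each source record is duplicated,
    # one copy tagged AverageRate and the other EndRate
    long_form = src.melt(id_vars="Entity", var_name="Account", value_name="Amount")
    print(long_form)
    The DataPump script does the same duplication row by row against the work table instead of in memory.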

  • Executing multiple sql statements from a single sql file

    Hi, I am Vijay Krishna.
    I want to drop a user, drop a tablespace, create a tablespace, and create a user from a single executable file or a single SQL file. The commands should run in sequence. How can we achieve this? Can anybody help me in this regard? I want this as soon as possible; it's urgent. Kindly post a reply.
    Also, how can we find the Oracle home directory from a Java program? The problem is that we need to know the Oracle home directory and use it when creating the tablespace; in the user interface we only ask for a new database user to be created. I would be really thankful if anybody can help me in this regard.

    It isn't showing any error messages. I will display the entire batch file we are using:
    sqlplus / as sysdba
    drop user examination cascade;
    drop tablespace examination;
    create tablespace examination
    datafile 'C:\oracle\product\10.1.0\oradata\orcl\examination.dbf'
    size 500M autoextend on;
    create user examination identified by examination
    default tablespace examination
    quota unlimited on examination;
    grant connect, resource to examination;
    exit;
    When I run the batch file from the DOS prompt, it enters the SQL prompt and comes out in a fraction of a second. We just see a screen come and go, but no error messages are displayed.
    At first we thought the problem was having the create tablespace and create user commands in the same file, so we created another file without the create commands. Even then the user didn't get dropped.
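    A likely cause, offered as an educated guess: in a batch file, everything after the sqlplus line is interpreted by cmd.exe, not by SQL*Plus, so the statements never reach the database. The usual fix is to put the statements in a .sql script and launch it with sqlplus @script. As a language-neutral illustration, here is a hypothetical Python sketch that pipes the same statements into sqlplus on stdin (paths and credentials are assumptions):
    import subprocess
    sql = r"""
    drop user examination cascade;
    drop tablespace examination;
    create tablespace examination
      datafile 'C:\oracle\product\10.1.0\oradata\orcl\examination.dbf'
      size 500M autoextend on;
    create user examination identified by examination
      default tablespace examination
      quota unlimited on examination;
    grant connect, resource to examination;
    exit;
    """
    # "/ as sysdba" must reach sqlplus as a single connect argument
    subprocess.run(["sqlplus", "-S", "/ as sysdba"], input=sql, text=True, check=True)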

  • File adapter reading and writing large files

    Hi, we are getting an error when trying to process large files using file adapters, files of 80 to 100 MB in size. We need to read the inbound files and write them to another folder on another server. The error we are getting is "out of memory". Thanks.

    Hi,
    Use an asynchronous process, or a checkpoint(), to save your instance state before it times out.
    --Khaleel
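    The out-of-memory error usually means the whole file is being buffered in memory at once. Whatever the adapter ends up doing, the underlying fix is a streamed, chunked copy, which keeps memory use flat regardless of file size; a minimal Python sketch of that idea (paths and chunk size are assumptions):
    import shutil
    SRC = r"\\inbound\transfer.dat"   # hypothetical source path
    DST = r"\\archive\transfer.dat"   # hypothetical destination path
    # Copy in fixed-size chunks so memory use stays constant
    with open(SRC, "rb") as src, open(DST, "wb") as dst:
        shutil.copyfileobj(src, dst, length=1024 * 1024)  # 1 MB at a time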

  • How to spool output of multiple scripts and get a single spool file output

    Hi,
    I have one master script that calls three other scripts. The three scripts each produce their own spool file, but I would like the master script to also produce one single output file (in addition to the three individual outputs, I mean). How do I do that? Can you please help?
    Following are the scripts:
    --m.sql (master script)
    spool c:\m.log
    @1.sql
    @2.sql
    @3.sql
    spool off
    --1.sql
    spool c:\1.log
    insert into test values(1);
    commit;
    spool off
    --2.sql
    spool c:\2.log
    insert into test values(2);
    commit;
    spool off
    --3.sql
    spool c:\3.log
    insert into test values(3);
    commit;
    spool off
    --table used
    SQL> desc test
    Name                                      Null?    Type
    A                                                  NUMBER

    When I run the above script m.sql, it does produce the other 3 log files (1.log, 2.log, etc.), but m.log (the master log file, which should contain the output of each of the three called scripts) is an empty file of 0 bytes!
    Thanks
    Edited by: orausern on May 1, 2011 3:17 AM

    Not sure if that's possible directly with the sqlplus spool option. When you spool to a different file in a single session, sqlplus stops writing to the earlier spool file and redirects output to the file specified in the last spool command.
    At the end of the master script, however, the following may help:
    host type c:\1.log >> c:\m.log
    host type c:\2.log >> c:\m.log
    host type c:\3.log >> c:\m.log

  • Syncing Multiple Text Boxes in a Single InDesign File

    In theory this problem seems rather simple, and there should be a simple way to accomplish my task, but every search I've tried for syncing text only explains how to thread text boxes.
    What I need to do is lay out a template that is 4-up on the page. Currently, whenever I make any edit to my text I have to copy the change multiple times to reflect it in each instance of the layout on the page.
    What I'd like to do is sync all my text boxes so that any changes or updates are automatically reflected in each instance of the layout. In the past I've accomplished this in Illustrator using symbols and instances of a symbol, but I've yet to find a similar method in InDesign.
    Any insight would be greatly appreciated.

    zslash64 wrote:
    I had stumbled across cross-referencing, but didn't understand that I needed a separate reference for each paragraph within the text box. This worked pretty well; however, when making changes to the original reference sources, I found the cross-references would break, so by the time I updated and re-linked each reference in every text box it ended up taking more time than simply copying the original text box 3 times. I guess I'm looking for more of a dynamic solution.
    Thanks for the help!
    Zach Barner
    Hi, Zach:
    Thanks for the feedback. You shouldn't have to relink or update multiple changed cross-references in a document individually.
    I did a test on my suggestion, because x-refs aren't supposed to break completely when source context is changed. I saw that I didn't caution you about deleting the cross-reference marker (AKA "Text Anchor"); when you create a cross-reference to source material, in the same or different documents, InDesign inserts a cross-reference marker at the beginning of the source paragraph. If it's not visible, enable Type > Show Hidden Characters. It looks like a colon character (:) in the layer's indicator color.
    NOTE: InDesign uses the term "Destination" to describe the paragraph that a cross-reference pulls into the cursor position when you insert a cross-reference, and it uses the term "Source" to describe the pulled-in destination paragraph that's displayed at the position where the cross-reference is inserted. This usage derives from the terms used for InDesign hyperlinks, where it makes better sense. Depending on your pre-InDesign experience with cross-references in longhand, typewritten, or other computer applications, you may find it, as I do, counterintuitive. I think of "source" as what you pull in, "destination" as the place where you display what's pulled in, and "reference" as the thing that's pulled in. I also use the term "source document" to describe the document that contains the reference's source, and I use the term "container document" to describe the document that contains the reference (the stuff that's pulled in.)
    So, when you change a cross-reference's source material in the source document, if you take care not to delete the cross-reference markers/text anchors that are created at the beginning of source paragraphs, the references in the container document's Hyperlinks/Cross-references panel display a yellow warning triangle for each changed source, and you can update them in one action.
    Updating the changed cross-references can also be confusing: if no changed (yellow triangle) cross-references are selected (highlighted) in the Hyperlinks / Cross-References panel, then whether you use the Update cross-references button at the bottom of the panel or the Update Cross-Reference item on the panel's flyout menu, ALL changed cross-references are updated. However, if one or more changed cross-references in the panel are selected, the update button or menu item updates only the selected references.
    See if this helps simplify your operations.
    HTH
    Regards,
    Peter
    Peter Gold
    KnowHow ProServices

  • Sample editor display jumps to file begin on zoom with large files

    Hey - does anyone else have this problem:
    When I zoom in to the maximum possible zoom level in the sample editor in Logic (e.g. to edit single samples with the pencil), in any audio file longer than 12:41 the waveform display suddenly jumps to the very beginning of the file.
    It happens using the zoom-in key, the zoom slider, or the zoom tool. It is somewhat infuriating because I have to guess when to stop pressing zoom-in to get as close as I can without triggering the jump. (If I go one zoom level too far, I have to go back, zoom out, re-find my place, and try again.)
    I did some investigation, and the bug in zoom behavior starts happening with audio files a little shy of 12 min 41 sec (12:40.871 to be more exact).
    Here are the results in "length in samples" of a test audio file (AIFF 24-bit Stereo, 44100Hz):
    33554453 samples and greater => sample editor jumps to the beginning when zoomed in to max
    33554432 - 33554452 samples => sample editor jumps to the END of the file when zoomed to max (bizarre, eh? a 20-sample window in which the bug works in the OPPOSITE direction!)
    33554431 samples and less => sample editor zoom is normal and zooms in perfectly to the proper location at max zoom.
    (Note that 33554432 is exactly 2^25, which strongly suggests some fixed bit-width limit in the zoom math.)
    I also tested other things like trashing my logic prefs and starting from an empty song with nothing in it - none of which make any difference. This bug is present in Logic 8.0.2 on both my Macbook Pro with OS X 10.5.7 and my Powerbook G4 with 10.4.11
    Maybe time to report this to apple - can anyone corroborate by just continually pressing your zoom-in key with the cursor in the middle of an audio file longer than 12:41 and see if the display jumps to the beginning?
    Thanks!

    bump?

  • How to convert VOB files back into a single DVD file

    I was ripping a DVD to my MacBook, and originally it came up as just a single file for DVD Player. I then decided to set the default media player to VLC. When I did this, the file broke down into several VOB files by itself. Is there any way I can convert them back into a single file, even if it is only playable in DVD Player? Thanks for any help.

    Put them all back in a VIDEO_TS folder and burn that.

  • Building webservices from multiple ejbs in a single ejbjar file

    Hi,
    I have multiple EJBs packaged in a single EJB JAR file. How do I generate a web service for each EJB from this EJB JAR file?
    I would really appreciate it if anybody could give some guidance on this.
    Regards
    Rao Peddi
    email: [email protected]

    Assuming you are using the servicegen Ant task, start by using a separate <service> element for each web service you are generating. Within each <service> element you would then specify the ejbJar attribute to point at your single EJB JAR, along with an includeEJBs attribute that points to the specific EJB in the JAR (note you would specify the <ejb-name> value of the specific EJB as it appears in the ejb-jar.xml inside your JAR file).
    Note that the individual services would be referred to by their specific serviceUri.
    The following is a simple example using ejbJar inside a single <service> element:
    <servicegen destEar="ears/myWebService.ear"
                warName="myWAR.war"
                contextURI="web_services">
      <service ejbJar="jars/myEJB.jar"
               targetNamespace="http://www.bea.com/examples/Trader"
               serviceName="TraderService"
               serviceURI="/TraderService"
               generateTypes="True"
               expandMethods="True">
      </service>
    </servicegen>
    Thanks
    Shridhar

  • Network speed affected by large file copy operations. Also, why intermittent network outages?

    Hi
    I have a couple of issues on our company network.
    The first is that a single large file copy impacts the entire network and dramatically reduces network speed. The second is that there are periodic outages where file open/close/save operations may appear to hang, and also where programs that rely on network connectivity, e.g. email, appear to hang. It is as though the PC loses its connection to the network, but the status of the network icon does not change. For the second issue, if we wait, the program will respond, but the wait period can be up to 1 minute.
    The downside is that this affects Access databases on our server, so that when an 'outage' occurs the Access client cannot recover and hangs permanently.
    We have a Windows Active Directory domain that comprises Windows 2003 R2 (soon to be decommissioned), Windows Server 2008 Standard and Windows Server 2012 R2 Standard domain controllers. There are two member servers: A file server running Windows 2008 Storage
    Server and a remote access server (which also runs WSUS) running Windows Server 2012 Standard. The clients comprise about 35 Win7 PC's and 1 Vista PC.
    When I copy or move a large file from the 2008 Storage Server to my Win7 client other staff experience massive slowdowns when accessing the network. Recently I was moving several files from the Storage Server to my local drive. The files comprised pairs
    (e.g. folo76t5.pmm and folo76t5.pmi), one of which is less than 1MB and the other varies between 1.5 - 1.9GB. I was moving two files at a time so the total file size for each operation was just under 2GB.
    While the file move operation was taking place a colleague was trying to open a 36k Excel file. After waiting 3mins he asked me for help. I did some tests and noticed that when I was not copying large files he could open the Excel file immediately. When
    I started copying more data from the Storage Server to my local drive it took several minutes before his PC could open the Excel file.
    I also noticed on my Win7 client that our email client (Pegasus Mail), which was the only application I had open at the time would hang when the move operation was started and it would take at least a minute for it to start responding.
    Ordinarily we work with many files.
    Anyone have any suggestions, please? This is something that is affecting all clients. I can't carry out file maintenance on large files during normal work hours if network speed is going to be so badly impacted.
    I'm still working on the intermittent network outages (the second issue), but if anyone has any suggestions about what may be causing this I would be grateful if you could share them.
    Thanks

    What have you checked for resource usage during one of these copies of a large file?
    At a minimum I would check Task Manager>Resource Monitor.  In particular check the disk and network usage.  Also, look at RAM and CPU while the copy is taking place.
    What RAID level is there on the file server?
    There are many possible areas that could be causing your problem(s).  And it could be more than one thing.  Start by checking these things.  And go from there.
    Hi, JohnB352
    Thanks for the suggestions. I have monitored the server and can see that the memory is nearly maxed out with a lot of hard faults (varies between several hundred to several thousand), recorded during normal usage. The Disk and CPU seem normal.
    I'm going to replace the RAM and double it up to 12GB.
    Thanks! This may help with some other issues we are having. I'll post back after it has been done.
    [Edit]
    Forgot to mention: there are 6 drives in the server. 2 for the OS (Mirrored RAID 1) and 4 for the data (Striped RAID 5).
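    Separately, if the large copies themselves have to run during work hours, one stopgap is to throttle the copy so it never saturates the link. A rough Python sketch of the idea (the chunk size and the 20 MB/s cap are arbitrary assumptions):
    import time

    def throttled_copy(src_path, dst_path, chunk=4 * 1024 * 1024, mbps=20):
        # Copy src to dst, pacing writes to roughly `mbps` MB/s
        with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
            while True:
                data = src.read(chunk)
                if not data:
                    break
                start = time.monotonic()
                dst.write(data)
                # Each chunk should take at least len(data) / (mbps MB/s) seconds
                min_duration = len(data) / (mbps * 1024 * 1024)
                elapsed = time.monotonic() - start
                if elapsed < min_duration:
                    time.sleep(min_duration - elapsed)

    throttled_copy(r"\\server\share\folo76t5.pmm", r"C:\local\folo76t5.pmm")  # hypothetical paths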

  • Printing problem with ADS on AS Java when printing large files

    Hi all,
    We have a WD for Java application running on AS Java 7.0 SP18 with Adobe Document Services. If we print small files everything works fine; with large files it fails (after around 2 minutes) with the following error.
    Any ideas?
    #1.5#869A6C5E590200710000092C000B20D000046E16922042E2#1246943126766#com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl#sap.com/tcwddispwda#com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl#KRATHHO#8929##sabad19023_CHP_5307351#KRATHHO#63a60b106ab311de9cb4869a6c5e5902#SAPEngine_Application_Thread[impl:3]_15##0#0#Error#1#/System/Server/WebRequests#Plain###application [webdynpro/dispatcher] Processing HTTP request to servlet [dispatcher] finished with error.
    The error is: com.sap.tc.webdynpro.clientserver.adobe.pdfdocument.base.core.PDFDocumentRuntimeException: Failed to UPDATEDATAINPDF
    Exception id: [869A6C5E590200710000092A000B20D000046E1692201472]#

    Hello
    On which support package level is the Java stack?
    kr,
    andreas

  • Flash Media Server taking forever to load large files

    We purchased FMIS, and we are encoding large 15+ hour MP4 recordings using Flash Media Encoder. When opening these large files for playback, if they have not been opened recently, the player displays the loading indicator for up to 4 minutes! Once a file has apparently been cached on the server it opens immediately from any browser, even after clearing the local browser cache. So a few questions for the experts:
    1. Why is it taking so long to load the file? Is it because the MP4 metadata is in the wrong format and the file is so huge? I read somewhere that Media Encoder records with incorrect MP4 metadata; is that still the case?
    2. Once it's cached on the server, exactly how much of it is cached? Some of these files are larger than 500 MB.
    3. What FMS settings do you suggest I change? FMIS is running on Windows Server R2 64-bit, but FMIS itself is 32-bit; we have not upgraded to the 64-bit version. We have 8 GB of RAM. Is it OK to set the FMS cache to 3 GB? And would that only have enough room for 3-4 large files? Because we have hundreds of them.
    best,
    Tuviah
    Lead programmer, solid state logic inc

    Hi Tuviah,
    You may want to email me offline about more questions here as it can get a little specific but I'll hit the general problems here.
    MP4 is a fine format, and I won't speak ill of it, but it does have weaknesses. In the FMS implementation those weaknesses tend to manifest around the combination of recording and very large files, so some of these things are a known issue.
    The problem is that MP4 recording is achieved through what's called MP4 fragmentation. It's a part of the MP4 spec that not every vendor supports, but it has a very particular purpose, namely the ability to continually grow an MP4-style file efficiently. Without fragments, a large file must be constantly rewritten as a whole to update the MOOV box (the file's index); fragments allow simple appending. In other words, it's tricky to make MP4 recording scalable (as for a server) and still have the basic MP4 format, hence fragments.
    There's a tradeoff to this, however, in that the index of the file is broken up over the whole file. Also, these large files are likely tucked away on a NAS or something similar, which is normal, as you likely can't store all of them locally. But that is the bad combination of needing to index the file (touching parts of the whole thing) and doing network reads to do it. This is likely the cause of the long delay you're facing. Here are some things you can do to help.
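    As an aside, you can see the difference between a flat and a fragmented file yourself by walking the top-level MP4 boxes: a flat file has one big 'moov' index, while a fragmented recording shows repeating 'moof' fragments. A hypothetical Python sketch (the file name is an assumption):
    import struct

    def list_boxes(path):
        # Print the type and size of each top-level box in an MP4/F4V file
        with open(path, "rb") as f:
            while True:
                header = f.read(8)
                if len(header) < 8:
                    break
                size, kind = struct.unpack(">I4s", header)
                print(kind.decode("latin-1"), size)
                if size == 0:   # box runs to end of file
                    break
                if size == 1:   # 64-bit extended size follows the header
                    size = struct.unpack(">Q", f.read(8))[0]
                    f.seek(size - 16, 1)
                else:
                    f.seek(size - 8, 1)

    list_boxes("recording.f4v")  # hypothetical file name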
    1. Post-process the F4V/MP4 files into a non-fragmented format. This may help significantly with load time; though it could still be considered slow, it should increase in speed, and it's cheap to try on a few files. (F4V and MP4 are the same thing for this purpose, so don't worry about the tool naming.)
    http://www.adobe.com/products/flashmediaserver/tool_downloads/
    2. Alternatively, this is why we created the raw: format. For long recordings MP4 is just not ideal, and the raw format solves many of the problems involved in doing this kind of recording. Check it out:
    http://help.adobe.com/en_US/flashmediaserver/devguide/WSecdb3a64785bec8751534fae12a16ad0277-8000.html
    3. You may also want to check out FMS HTTP Dynamic Streaming. It also solves this problem, along with others like content protection and DVR, and it's our most recent offering, so it has a lot of strengths the other approaches don't.
    http://www.adobe.com/products/httpdynamicstreaming/
    Hope that helps,
    Asa

  • How Can I Convert Multiple TextEdit Files Into A Single Giant File?

    For example: I have multiple chapters for my new book, each saved as an individual file, and I want to combine all the chapters into one single continuous file that I can work with in either TextEdit or Word. How can I do this? It's hundreds of files...
    Thanks!

    Hi Bruce, Tex-Edit Plus for OS X...
    http://www.tex-edit.com/
    It can merge files and then sort them easily; it even has a remove-duplicate-lines script available.
    If the version you get doesn't happen to have the sort, here's a link; put "sort" in the search-scripts box and get the second one, Sort Selection...
    http://dougscripts.com/texedit/
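    If you'd rather script it, a minimal Python sketch that concatenates every .txt file in a folder into one file, in sorted name order (the folder layout and extension are assumptions):
    from pathlib import Path

    # Collect the chapter files in sorted (chapter-number) order
    chapters = sorted(Path.home().joinpath("Book", "Chapters").glob("*.txt"))

    with open(Path.home() / "Book" / "book_combined.txt", "w") as out:
        for chapter in chapters:
            out.write(chapter.read_text())
            out.write("\n\n")  # blank line between chapters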
