[CS5.5/6] - XML / Data Merge questions & Best practice.

Fellow Countrymen (and women),
I work as a graphic designer for a large outlet chain retailer that is constantly growing its base of centers. That growth has turned a workload that used to be manageable with just two people into a never-ending sprint with five. Much of what we do is print, which is not my forte, but it is also a generally disorganized, ad-hoc affair into which I am wading to try to reduce the overall strain.
Upon picking up InDesign I noted the power of the simple Data Merge function and have added it to our repertoire for mass-merging data sources. There are some critical failures I see in it as a tool going forward for our purposes, however:
1) Data Merge cannot handle information stored and categorized in a single column well. As an example, we have centers in many cities, and each center has its own list of specific stores. Data Merge cannot handle a single-column (or even multi-column) list of these stores very easily, which has forced us into manual operations: concatenating the data into one cell with delimiter characters and then, after the merge, finding and replacing the delimiters with hard returns to separate them (see the sketch after this list).
2) Data Merge offers no method of alternate alignment of data, or selection by ranges. That is to say, I cannot tell Data Merge to start at cell 1 in one column and, in another column, select say... cell 42 as the starting point.
3) Data merge only accepts data organized in a very specific, and generally inflexible pattern.
These are just a few limitations.
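For what it's worth, the concatenation step in (1) does not have to stay manual. Below is a minimal sketch of how it could be scripted -- the stores.csv file and its CenterID / CenterName / StoreName columns are hypothetical stand-ins for our real feed -- producing one Data Merge-ready row per center, with the store list joined by a pipe character that can later be find/replaced with forced line breaks after the merge.

import csv
from collections import OrderedDict

# Group stores by center (file and column names are assumptions, not our real feed).
centers = OrderedDict()
with open("stores.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        center = centers.setdefault(row["CenterID"],
                                    {"name": row["CenterName"], "stores": []})
        center["stores"].append(row["StoreName"])

# One row per center; "|" is the delimiter to swap for hard returns later.
with open("datamerge_source.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["CenterName", "StoreList"])
    for center in centers.values():
        writer.writerow([center["name"], "|".join(center["stores"])])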
ON TO MY ACTUAL DILEMMA aka Convert to XML or not?
Recently my coworker suggested we move toward using XML as a repository / delivery system that helps us quickly get data from our SQL database into a usable form in InDesign.
I've watched some tutorials on Lynda.com and haven't yet seen a clear answer to a very simple question:
"Can XML help to 'merge' large, dynamic data sets -- like a list of 200 stores per center across 40 centers -- based off of a single template file?"
What I've seen suggests I would need to manually duplicate pages, linking the correct XML entry as I go, rather than the program generating a set of merged pages the way Data Merge does with very little effort on my part. Perhaps setting up a master page would allow for easy drag-and-drop fields for my XML data?
I'm not an idiot, I'm simply green with this -- and it's kind of scary because I genuinely want us to proceed with the most flexible, reliable, trainable and sustainable solution. A tall order, I know. Correct me if I'm wrong, but XML is that beast, no?
Formatting the XML
Currently I'm afraid our XML feed for our centers isn't formatted correctly; the current format looks like this:
<BRANDS>
     <BRAND>
          • BrandID = xxxx
          [Brand Name]
          [Description]
          [WebMoniker]
          <CATEGORIES>
               <CATEGORY>
                    • xmlns = URL
                    • WebMoniker = category_type
          <STORES>
               <STORE>
                    • StoreID = ID#
                    • CenterID = ID#
I don't think this is currently usable, because if I wanted to create a list of stores for a particular center, that information is stored as an attribute of the <STORE> tag, buried deep within the data, making it impossible to 'drag-n-drop'.
Not to mention that much of the important data is held in attributes rather than in text elements that are children of the tag.
I'm thinking of proposing the following organizational layout:
<CENTERS>
     <CENTER>
     [Center_name]
     [Center_location]
          <CATEGORIES>
               <CATEGORY>
                    [Category_Type]
                    <BRANDS>
                         <BRAND>
                              [Brand_name]
My thought is that if I have the <CENTER> tag, then I can simply drag it into a frame and it will auto-populate all of the brands, by category (as organized in the XML), for that center into the frame.
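To show what I mean, here is a rough sketch (not a working part of our pipeline) of re-pivoting the current brand-centric feed into something like the proposed layout, using only Python's standard library. Tag and attribute names are taken from the sketches above where they exist and are otherwise assumptions (e.g. Brand_Name and the center-name lookup), and it skips the category level for brevity.

import xml.etree.ElementTree as ET

# Hypothetical lookup: the current feed only carries a CenterID, not a center name.
center_names = {"1001": "Example Center"}

src = ET.parse("brands_feed.xml").getroot()   # <BRANDS> root, as in the current feed
out = ET.Element("CENTERS")
brand_lists = {}                              # CenterID -> that center's <BRANDS> element

for brand in src.iter("BRAND"):
    brand_name = brand.findtext("Brand_Name", default="")
    for store in brand.iter("STORE"):         # <STORE> carries CenterID as an attribute
        cid = store.get("CenterID", "")
        if cid not in brand_lists:
            center = ET.SubElement(out, "CENTER")
            ET.SubElement(center, "Center_name").text = center_names.get(cid, cid)
            brand_lists[cid] = ET.SubElement(center, "BRANDS")
        new_brand = ET.SubElement(brand_lists[cid], "BRAND")
        ET.SubElement(new_brand, "Brand_name").text = brand_name

ET.ElementTree(out).write("centers_feed.xml", encoding="utf-8", xml_declaration=True)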
Why is this important?
This is used in multiple documents with different layout styles, and since our store lists are ever-changing as leases end or begin, across 40+ centers this becomes a big hairy monster. We want this to be as automated as possible, but I'd settle for a significant amount of dragging and dropping as long as it is simple and straightforward. I have a high tolerance for drudging through code and creating workarounds, but my co-workers do not. This needs to be a system that is repeatable and understandable, and it needs to function whether I'm here or not -- mainly because I would like to step away from the responsibility of setting it up every time.
I'd love to hear your raw, unadulterated thoughts on the subject of Data merge and XML usage to accomplish these sorts of tasks.  What are your best practices and how would you / do you accomplish these operations?
Regards-
Robert

From what I've gleaned from the Lynda tutorials on the subject, what I'm hoping to do is indeed possible.
Peter, I don't disagree with you that there is a steep learning curve for me as the instigator / designer of this method for our team, but for my teammates and end users that curve will be softened considerably. Even so, I'm used to steep learning curves and the associated frustrations -- I cope well with new learning and am self-taught in many tools and programs.
Flow-based XML structures:
It seems that as long as the initial page is set up correctly using imported XML, individual data records that cascade in a logical fashion can be flowed automatically onto new pages. Basically, you create an XML-based layout with the dynamic portion you wish to flow in a single frame, apply paragraph styles to the different tags appropriately, and then, after deleting unused records, reimport the XML with some specific boxes checked (depending on how you wish to proceed).
From there, simply dragging the data root into the frame will cause overset text as it imports all the XML information into the frame. Assuming everything is cascaded correctly, using auto-flow will cause new pages to be generated automatically with the tags correctly placed, in a similar fashion to Data Merge -- but far more powerful and flexible.
The issue then again comes down to data organization in the XML file. To use this method, the data must be organized in the same order in which it will be displayed. For example, if I had a Lastname field and a Firstname field in that order, I could not call the Firstname first without breaking the document under the flow method. I could, however, still drag and drop content from each tag into the frame and it would populate correctly regardless of the order of appearance in the XML.
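If a feed ever does arrive in the wrong order, a small pre-processing pass can reorder each record's children to match the layout before import. A minimal sketch, again with the standard library, where the RECORD / Firstname / Lastname tag names are purely hypothetical:

import xml.etree.ElementTree as ET

DISPLAY_ORDER = ["Firstname", "Lastname"]   # the order the layout flows in

tree = ET.parse("records.xml")
for record in tree.getroot().iter("RECORD"):
    ordered = sorted(list(record),
                     key=lambda el: DISPLAY_ORDER.index(el.tag)
                     if el.tag in DISPLAY_ORDER else len(DISPLAY_ORDER))
    for child in list(record):                # rebuild the record in display order
        record.remove(child)
    record.extend(ordered)
tree.write("records_reordered.xml", encoding="utf-8", xml_declaration=True)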
Honestly, either method would be fantastic for our current set of projects; however, the flow method may be particularly useful for jobs that require more than 40 spreads, or for simple layouts with huge amounts of data to merge.

Similar Messages

  • [CS4-CS5] Table from XML: what's the best practice?

    Hi,
    I have to build a huge table (20-25 pages long...) from an XML file.
    No problem with that; I wrote an XSLT file to convert my client's XML into the "Table/Cell structure" InDesign needs, with all the style parameters.
    The problem is that it takes InDesign a long time (4-5 hours) to build the whole table.
    I wonder if this is still the best practice with such a huge amount of data (the input XML is 1.1 MB).
    I also tried to build the table using a script (JavaScript), but from some timing tests I can see the problem is even bigger.
    I'm currently using an iMac (Mac OS X 10.6.2) with a 3.06 GHz Intel Core 2 Duo and 8 GB of RAM; it's not exactly the worst computer in this world...
    Is there a best practice for this kind of work?
    Client is becoming a pain in the arse...
    Thanks in advance!

    First transform the XML through XSLT separately, and then import that XML into InDesign.
    Hope it helps.
    Regards,
    Anil Yadav
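    A minimal sketch of the suggestion above -- running the poster's existing XSLT outside InDesign (here with Python's lxml package; file names are hypothetical) so InDesign only has to place the already-converted file:
    from lxml import etree

    # Apply the existing stylesheet to the client's XML ahead of time.
    transform = etree.XSLT(etree.parse("client_to_indesign_table.xsl"))
    result = transform(etree.parse("client_data.xml"))
    result.write("table_for_indesign.xml", xml_declaration=True, encoding="UTF-8")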

  • Data warehousing question/best practices

    I have been given the task of copying a few tables from our production database to a data warehousing database on a once-a-day (overnight) basis. The number of tables will grow over time; currently it is 10. I am interested in not only task success but also best practices. Here's what I've come up with:
    1) drop the table in the destination database.
    2) re-create the destination table from the script provided by SQL Developer when you click on the 'SQL' tab while you're viewing the table.
    3) INSERT INTO the destination table from the source table using a database link. Note: I am not aware of any columns in the tables themselves which could be used to filter added/deleted/modified rows only.
    4) After data import, create primary key and indexes.
    Questions:
    1) SQL Developer included the following lines when generating the table creation script:
    <table creation DDL commands>
    then
    PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING
    STORAGE (INITIAL 251658240 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
    PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
    TABLESPACE "TBLSPC_PGROW"
    it generated this code snippet for the table, the primary key and every index.
    Is this necessary to include in my code if they are all default values? For example, one of the indexes gets scripted as follows:
    CREATE INDEX "XYZ"."PATIENT_INDEX" ON "XYZ"."PATIENT" ("Patient")
    -- do I need the following four lines?
    PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS
    STORAGE(INITIAL 60817408 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
    PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
    TABLESPACE "TBLSPC_IGROW"
    2) Anyone with advice on best practices for warehousing data like this, I am very willing to learn from your experience.
    Thanks in advance,
    Carl

    I would strongly suggest not dropping and recreating tables every day.
    The simplest option would be to create a materialized view on the destination database that queries the source database and to do a nightly refresh of that materialized view. You could then create a materialized view log on the source table and then do an incremental refresh of the materialized view.
    You can schedule the refresh of the materialized view either in the materialized view definition, as a separate job, or by creating a refresh group and adding one or more materialized views.
    Justin

  • Question - Best practice data source for Vs2008 and Crystal Reports 2008

    I have posted a question here
    CR2008 using data from .NET data provider (ADO.NET DATASET from a .DLL)
    but think that perhaps I need general community advice on best practice with data sources.
    In Crystal reports I can choose the data source location from any number of connection types, eg ado.net(xml), com, oledb, odbc.
    Now, in regard to that post: the reports were all created in Crystal Reports 6.3, upgraded to Crystal XI, and now I'm using the latest and greatest. I wrote the Crystal Reports 6.3/XI reports back in the day to do the following: the reports use a function from a COM object which returns an ADO recordset, which is then consumed fine.
    So I don't want to rewrite all these reports, of which there are many.
    I would like to know if any developers are actually using .NET Class libraries to return ADO.NET datasets via the method call or if you are connecting directly to XML data via whatever source ( disk, web service, http request etc).
    I have not been able to eliminate the problem listed in the post mentioned above, which is that the Crystal Report is calling the .NET class library method twice before displaying the data. I have confirmed this by debugging the class lib.
    So any guidance or tips is appreciated.
    Thanks

    This is already being discuss in one of your other threads. Let's close this one out and concentrate on the one I've already replied to.
    Thanks

  • Data Merge questions

    We are using MS Mail Merge to populate existing forms. We have 250+ different MS Word .docx forms that have different information in different places. We are serving the mail merge data from a database through a VB.Net program and it's working well. As most probably know, the process is to map mail merge fields onto the form using Word and then populate them through the application based on the form and the information the end user needs.
    We want to do the same thing with PDF documents. I've been poking around this site and the internet and am confused about what is available to do this. Data Merge seems to do roughly the same thing, but through files instead of a database. I see some functions in this and other Adobe SDKs, but I'm not clear on which is the one I'd need. And it would need to be programmable with VB.Net.
    Or do I need an aftermarket SDK to do this (hope it's okay to ask that question here)?

    You might want to look into some of the third party catalogue plugins for ID.

  • InDesign Data Merge Question....

    I have a calendar created using the Data Merge feature within InDesign. The only question I have thus far: for each day of the week we currently have a text frame linking to a .txt file that our customers add events to. How do I link the specific .txt files to the frame? I can't use the image function in my Excel file, as it does not recognize them as such. Is there another way to script this, or something to put in my source file, to link and place the text from the .txt files (individual .txt files for each day of the year)? Any help would be appreciated. Thanks!

    I don't think I really understand the structure of the data file you have. My sense, though, is that this may not really be a data merge project.
    At its heart, Data Merge is very primitive. It's great for mailing labels, but can get defeated by more complex projects. For the merge to work, all of the data must be structured the same for each record in terms of the number of fields, and the text you want to include needs to be in the fields (as far as I know, you cannot use a pointer to an external text file in a field the same way that you can use a pointer to an image). You do have a choice when setting up the merge template, though, of putting the placeholders on the document page or on the master page.
    If you put the placeholders on the master page, the data can be updated AFTER the merge, at least in theory (it doesn't seem to work well with images). That doesn't really sound like what you need, but it is a possibility if you have a basic document and the clients add text later.
    In any case, if you are letting the client add text, you need to get that text into your data file directly -- either in Excel, re-exporting the entire data set, or with some sort of find/change routine that replaces tags in the text file with the text supplied by the client.
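    If it helps, the find/change routine could be as simple as the sketch below, which swaps placeholder tags in the merge data file for the client-supplied text (the placeholder convention and file names here are hypothetical):
    # Client-supplied event text keyed by the placeholder tag it replaces.
    client_text = {
        "[JAN01]": "New Year's Day - office closed",
        "[JAN15]": "Staff meeting, 9 am",
    }

    with open("calendar_data.txt", encoding="utf-8") as f:
        data = f.read()
    for tag, text in client_text.items():
        data = data.replace(tag, text)
    with open("calendar_data_final.txt", "w", encoding="utf-8") as f:
        f.write(data)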

  • Newbie XML data source questions

    Post Author: tel
    CA Forum: Data Connectivity and SQL
    I'm trying to use the XML data source via http and i'm running into a couple of issues.
    First, we have separate environments for development, QA, and production. Each environment will have a separate URL to retrieve the XML for the report. I would like to use the same report design to access each environment. Is there a way to update the URL in the data source (or replace the data source with one that has the correct URL) without having to re-add the fields to the report design?
    The second issue is similar.  I can't seem to get Crystal to recognize changes in the XML format (new fields for example) without having to delete and recreate the data source (which automatically removes all the associated fields from the report).  Is there a way around this, or again, a way of re-creating the data source without it removing the existing fields in the report?
    Thanks,

    Post Author: rosariosanto
    CA Forum: Data Connectivity and SQL
    I have the same problem. Where can I find help with setting the data source when connecting to a web service? Since the hostname is hardcoded in the report, it is necessary to update it in code.

  • Data Merge questions within CS 5.5...Please help

    Basically, we utilize this now with our calendar designs for schools, which include school events on certain days via the data merge, along with building each calendar design from a 2-page spread with data merge. The issue we are running into is that we send the school a proof of their calendar before we send it to print. At that point they have an opportunity to add/delete/edit any events they want on our online portal, and we pull that info into our database to create an updated CSV file. Is there a way, once the first CSV file is merged, to have it behave as a link or something similar? By this I mean it would be nice if, when we re-generate the CSV file at proof time with the updated events, those events simply updated in our InDesign file by updating the link or something similar. We want to avoid re-building the calendar from scratch (the 2-page spread) at this point. Although for 70%+ rebuilding is not an issue, the other 30% are custom calendars where we made several edits to the 2-page template to start with. We don't want to have to do this custom work again, or possibly miss those custom changes when it is recreated. Any suggestions or help would be appreciated. Thanks!

    The good news is that you won't need to re-build anything. That said, I see 2 ways to update your file.
    Save the original as a template (before it's merged) and run the merge again once the CSV has been finalised. That way you always have the template of the first and each time you run the merge you get a new indd file.
    The other is based on the assumption that you added your merge fields into the document and not the master of the original file. If so, you can copy the document page content into the master pages, add the merge fields to the master and use this as a template. When you merge the document this time the CSV becomes a linked file in the resultant file.
    The difference between 1 and 2 is that in 2 the link to the CSV file is kept in the merged file. Either way, it's a very easy update for you, but you will have to re-apply your custom changes. This should just involve a straight copy/paste, though.

  • Probably a basic data merge question...

    Hi,
    I am trying to create a multiple-record layout using data merge. The problem is that when I preview the document everything looks fine (first screenshot). When I actually create the merged document, however, the last two data entries on each page are pushed off the edge of the page (second screenshot).
    If anyone has any suggestions or advice it would be greatly appreciated!
    Also let me know if you need any more information.
    Thanks in advance,
    Jack

    Your page number is being repeated with everything else. You'll need to do your merge without the numbers, and then add them to the merged document afterwards.

  • Another Data Merge question...

    Thanks, SRiegel, now that I’ve added the apostrophe in the title of the photo column, the Data Merge box DOES indicate photo. The problem is, when I do the Data Merge, it says the photos can’t be found. Here’s an example of how I have the photos listed:
    U:\MyPictures\employees3\smithjohn.psd
    Thank You

    Is that path correct? Can you place that image, outside of Data Merge, into another document, and does it show that as the path?
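    One quick way to check is to verify every path in the image column before merging; a small sketch, assuming a hypothetical data.csv whose image column is headed "@photo":
    import csv
    import os

    with open("data.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            path = row.get("@photo", "")
            if not os.path.isfile(path):   # flag any image InDesign won't find
                print("Missing:", path)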

  • Quick Data Merge question

    I've done this before but I only do it about once a year so I'll be damned if I can remember how to do it. However, I think I remember... does this sound about right?
    I'm doing 4 up postcards on an 8.5 x 11. I have one postcard in the upper left with text boxes in the address area.
    I linked up the Excel document. Then I'm going to grab "First Name" from the Data Merge panel and drop it into one of the text boxes, then grab "Last Name" and drag it into another, etc. Then I'm going to hit "Create Merged Document" and set it for "Multiple Record Layout"... but here's where I'm getting stuck. How do I tell it to do 4-up? When I hit "preview multiple record layout" it just kind of shifts the one down and to the right a little but doesn't put it 4-up. Any thoughts?

    Yes, they will. But I think my "spacing" is goofed up.
    The layout is 8.5 x 11 horizontal, so each one is 5.5" wide and 4.25" tall (4-up on a sheet). The one in the upper left should repeat to the right 5.5", down 4.25", and then to the right 5.5" and down 4.25" so it's in the bottom right corner....
    I think I need help in the "Multiple Record Layout" section

  • When to use unattend.xml in task sequence - best practice?

    Hi, I've tried researching this but not found an answer to my specific query.
    We have ConfigMgr 2012 R2 with MDT 2013 although I don't think this is an MDT specific question.
    I'm trying to create a Build and Capture task sequence for our Windows Server 2008 R2 and Server 2012 /2012R2 server builds utilising an UNATTEND.XML file to make some customisations that can be deployed for every build afterwards in a Deployment Task Sequence.
    Specifically, the addition of some Windows Features like SNMP and its configuration, and the addition of the Telnet Client. There are other bits like language settings and configuration items, but I'm specifically interested in the Features part for my question.
    In CM 2012R2 you now have the option under the "Apply Operating System" to use a captured image or an original installation source. However they work differently if you specify the use of the same unattended answer file.
    The "image" deployment ignores all of the "add features" sections of the XML file and the "installation source" loses the  configuration options from SNMP from the XML file. When you then deploy the captured image using
    the same unattend.xml again the one from the "installer" now has all the SNMP features required and the one from the "image" is still missing everything.
    So my question is as follows.
    What is best practice for specifying an unattend.xml file in a task sequence? Is it in the Build and Capture TS or in the Deployment TS?
    or
    Do I need multiple XML files, one for build and capture with some bits in and another for deployment with the rest in?
    or
    Should I be doing something else?
    Although this is specifically asking about Server O/S we will be using the same methodology for Windows 7 deployment.

    In this case DISM is only used to add the actual features... for configuration you could use a simple script that runs afterwards. Sample registry file:
    SAMPLE REG FILE - HKLM-SNMP.reg
    Windows Registry Editor Version 5.00
    [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\services\SNMP\Parameters]
    "NameResolutionRetries"=dword:00000010
    "EnableAuthenticationTraps"=dword:00000001
    [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\services\SNMP\Parameters\PermittedManagers]
    [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\services\SNMP\Parameters\RFC1156Agent]
    "sysServices"=dword:0000004f
    "sysLocation"=""
    "sysContact"=""
    [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\services\SNMP\Parameters\TrapConfiguration]
    [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\services\SNMP\Parameters\TrapConfiguration\public]
    "1"="127.0.0.1"
    [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\services\SNMP\Parameters\ValidCommunities]
    "public"=dword:00000004
    Sample batch file:
    SAMPLE SCRIPT FILE - ConfigureSNMPService.bat
    @ECHO OFF
    net stop "SNMP Service"
    regedit /s HKLM-SNMP.reg
    net start "SNMP Service"
    Also some settings for SNMP can be controlled through group policy:
    http://serverfault.com/questions/285762/group-policy-for-multiple-snmp-permitted-managers

  • HR Master Data conversion-SAP Best Practices

    Hello there,
    We would like to use the SAP Best Practices for HR Master Data conversion. 
    Now we want to leverage the SAP Best Practices to convert the master data. Could anyone explain in detail how to do this?
    How do we install the Best Practices only to the extent of the data conversion? We don't want to use the rest of the Best Practices.
    I know there are some notes out there.
    Any help on the above is highly appreciated.

    Hi,
    I am not very sure if you can install only the required component, but there would be some prerequisites for every installation.
    It will be clearly mentioned in the baseline documentation.
    Also check whether it is available for the country you are currently working in...
    Use the eCATT test configurations & test scripts.
    Please revert in case you need any further details.

  • ISE policy creation question - best practices

    Ok, I am a rookie ISE user here and am trying to learn as I go. I have a 802.1x policy for our corporate users on both wired and wireless and a wireless guest policy that redirects to the guest portal to enter credentials created in the sponsor portal. The corporate user has access to corporate resources and the guest basically has access to just the internet.
    I need to make what I am calling a Vendor policy that is basically a hybrid of the corporate user and the guest user. These would be vendors that are on-site to assist with programming and need access longer than what a guest account can be created for. It would also have specific ACLs that grant them access to the specific resources they would need. I would like to tie this into AD authentication, since in most cases they have an AD account created so they can access those corporate resources. My first question: do I have a single policy that is tweaked as vendors come and go, or do I simply create a specific policy for each vendor? My second question: do I, or should I, create unique SSIDs for each vendor?
    As I said, I am just now getting into configuring ISE. I am just not sure what is considered a best practice or a secure way to make things happen. In regards to the policies I have created, they work, but I think I have a couple of holes to address.
    Thanks ...
    Brent

    Mostly makes sense. I have the AD part just need to get an AD group created for my test subject.
    I created an Endpoint Identity Group to place the vendors devices into so that we can allow laptop to connect but not phone. Got that.
    I think I can handle the Authorization Profile. It will be something like if VendorAsset and AD1:ExternalGroups Equals VendorADGroup then VendorPermissions. VendorPermissions would be the ACL that limits where they can go. I also need to create a non 802.1x based SSID as well and add this to the Authorization profile but can still be generic enough to be useable by all vendors.
    I think it is my Authentication rules that I need to modify for Vendor as my Corporate based policies use Dot1x and I need a policy that does not use dot1x. Right?

  • Spry XML Data Set question

    Hello all, I've got a question that I've been wrestling with for the better part of 2 days now. I'm running several filters on a very large data set. Each row in the data set is a product that has 8 or so properties. There are about 900 rows, and each row has a SKU; out of the 900 rows there are only about 50 unique SKUs. Right now every row is being displayed, but what I want to do is display each SKU only once. I tried selecting "remove repeating nodes" from the UI, but this doesn't work because the entire nodes aren't repeating, just the SKUs.
    Is this something that is easily doable?
    You can view the working application here (note: runs best in FF3, I need to trim down the DS to get it to work well in IE)
    www.bradygodwin.com/test/rnaToolNewData.html
    and the XML here
    http://www.bradygodwin.com/test/rnaindev.xml
    Thanks in advance for the help!

    Hi there
    You are indeed absolutely correct that a Spry region should have been shown; my apologies for that. The code is wrapped in a standard Spry region.
    That being said, I have used a workaround in the SQL SELECT statement of the xmlExportObj recordset to find the information required without having to do any IF/ELSE logic on the page.
    Many thanks for your reply and for pointing out my mistake in how I had presented my question.
    My next question will follow separately.
    Regards
    Ray
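    For anyone reading later, an alternative to the SQL workaround is to pre-filter the XML feed so each SKU appears only once before Spry loads it. A rough sketch, assuming <row> nodes with a <sku> child sitting directly under the document root (the real feed's tag names may differ):
    import xml.etree.ElementTree as ET

    tree = ET.parse("rnaindev.xml")
    root = tree.getroot()
    seen = set()
    for row in list(root.findall("row")):     # rows assumed to be direct children of the root
        sku = row.findtext("sku", default="")
        if sku in seen:
            root.remove(row)                  # drop rows whose SKU was already seen
        else:
            seen.add(sku)
    tree.write("rna_unique.xml", encoding="utf-8", xml_declaration=True)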
