Bridge metadata handling and filtering

It would be great if you could add two features:
1. Adding keywords to multiple files: currently, if two files share the same keywords but not in the same order, adding a new keyword is impossible without overwriting the existing keywords.
2. Filtering by location data: it would be great to filter by country, say, just as by keyword, date or other criteria.
Thanks, Erik

For adding keywords without removing existing keywords, this script might be of use to you...
http://www.ps-scripts.com/bb/viewtopic.php?f=19&t=2364&sid=57a50afd1b53b5fc195f0e5dfdbfab06
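If you end up writing your own, the core of such a script is just an order-insensitive union of the existing and new keyword lists. A minimal sketch of that merge logic (shown in Java purely to illustrate the idea; Bridge itself is scripted in ExtendScript, and the class and names here are hypothetical):

    import java.util.*;

    public class KeywordMerge {
        // Appends new keywords without overwriting the existing ones, treating
        // keyword lists that differ only in order as the same set.
        public static List<String> merge(List<String> existing, List<String> toAdd) {
            Set<String> merged = new LinkedHashSet<String>(existing); // keeps original order
            merged.addAll(toAdd); // duplicates are dropped regardless of order
            return new ArrayList<String>(merged);
        }

        public static void main(String[] args) {
            List<String> fileA = Arrays.asList("beach", "sunset", "Erik");
            List<String> fileB = Arrays.asList("sunset", "beach", "Erik"); // same keywords, other order
            List<String> newKw = Arrays.asList("Spain", "beach");
            System.out.println(merge(fileA, newKw)); // [beach, sunset, Erik, Spain]
            System.out.println(merge(fileB, newKw)); // [sunset, beach, Erik, Spain]
        }
    }

Because the existing keywords go in first and duplicates are simply ignored, two files whose keywords differ only in order both end up with their original keywords intact plus the new ones appended.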

Similar Messages

  • SRM7.0 Meta Data Handling or Component Configuration

    Hi Experts!
    Let's say you, for some reason, want to remove a field in the Shopping Cart at item level. To achieve this you could, e.g.:
    1. Use the Component Configurator (SE80) and the browser-based editor for the WDP ABAP Component Configurator
    2. Use the Meta Data Handling concept and configure the field control there
    My guess is that one should use #2 if one needs to control the behaviour of the field according to roles or authorities. Is this correct, or are there other considerations that should be taken into account when performing UI adjustments using the above-mentioned methods?
    brgs Ziggy

    Check the below link, this should help you.
    http://wiki.sdn.sap.com/wiki/display/SRM/HowtoaddCustomerfieldsinSAP+SRM
    Regards,
    Jagadish

  • Data Mining on data specified and filtered by the user in runtime

    Hi Experts,
    I am new to Data Mining in SAP BI (we are on BI 7.0, SP level 20). I familiarised myself with APD and Data Mining by reading some interesting and useful threads in this forum and some other resources. I thus got an understanding of the topic and was able to create a basic data mining model for an association analysis and a corresponding APD for it, and to write the results into a DSO by using the data source. But so far I have not been able to find a solution for a concrete customer requirement.
    The user shall be able to select an article, a retail location and a month, and get the top n combinations sold with that article in the particular location and month. For that he may not access the Data Mining Workbench or any other SAP-internal tools, but he shall be able to start the analysis from the portal (preferably via a query).
    We have had some thoughts on the scenario. The first idea was to create an APD for every location for the last month; as we need to cover more than 100 locations, this would not be practicable. I therefore think it would be necessary for the user to select the particular filters, with the data mining then executed on the given input.
    The other idea was to use a query as the source. The user would start this query and filter location and month in it. The result of the query could then be used as the source for the APD with the association analysis. For that we would need to create a jump point from that query which starts the APD with those results. After that, the user should be able to start a result query which displays the result of the association analysis (ideally this result query would start automatically, but starting it manually would be OK, too).
    So, I have the following questions for these scenarios:
    1.) Is it possible to create variants of a single APD, to automatically run the data mining for the different locations?
    2.) Is it possible to start an APD out of a query, with the particular filtered results?
    3.) Can we place a query directly on the data mining results (how?) or do we need to write the data mining results in a DSO first?
    4.) What about the performance? Would it be practicable to do the data mining in runtime with the user waiting?
    5.) Is the idea realistic at all? Do you have any other idea how to accomplish the requirement (e.g. without APD but with a query, specific filter and conditions)?
    Edited by: Markus Maier on Jul 27, 2009 1:57 PM

    Hi,
    You can see an example: go to SE80, select BSP Application SBSPEXT_HTMLB, then open tableview.bsp; it should give you some idea of how to make the code you have written clearer.
    DATA: tv TYPE REF TO CL_HTMLB_TABLEVIEW.
    tv ?= cl_htmlb_manager=>get_data(
            request = runtime->server->request
            name    = 'tableView'
            id      = 'tbl_o_table' ).
    IF tv IS NOT INITIAL.
      DATA: tv_data TYPE REF TO CL_HTMLB_EVENT_TABLEVIEW.
      tv_data = tv->data.
      IF tv_data->prevSelectedRowIndex IS NOT INITIAL.
        FIELD-SYMBOLS: <row> LIKE LINE OF sflight.
        " sflight stands in for your own internal table ("ur tablename" in the original)
        READ TABLE sflight INDEX tv_data->prevSelectedRowIndex ASSIGNING <row>.
        DATA: value TYPE STRING.
        value = tv_data->GET_CELL_ID( row_index    = tv_data->prevSelectedRowIndex
                                      column_index = '1' ).
      ENDIF.
    ENDIF.

  • Data handling and graphs

    I originally wrote the graph program to handle the data in a text file that was organized like this...
    vertex a
    vertex b
    vertex c
    vertex d
    edge a c
    edge a d
    edge d b
    and now I have to change the main to accept a datafile containing...
    a b
    b c
    c e
    d g
    g c
    Now, here is a copy of the main program as it currently stands...
    import java.io.*;
    import java.util.*;

    public class TopSort {
        static Graph theGraph = new Graph();
        static Hashtable hashList = new Hashtable(); // just to store array index

        public static void main(String args[]) {
            MyInfoachaffin myInfo = new MyInfoachaffin();
            myInfo.info();
            File sourceFile = new File(args[0]); // access the file
            if (sourceFile.exists() == false) {
                System.err.println("Oops: " + sourceFile + ": No such file");
                System.exit(1);
            }
            String newVertex, startEdge, endEdge;
            int arrayPosition = -1;
            try { // open the file
                FileReader fr = new FileReader(sourceFile);
                BufferedReader br = new BufferedReader(fr);
                String input;
                while ((input = br.readLine()) != null) {
                    StringTokenizer toke = new StringTokenizer(input);
                    while (toke.hasMoreTokens()) {
                        if (hashList.containsValue(toke)) {
                            return;
                        } else {
                            newVertex = toke.nextToken(); // get vertex
                            theGraph.addVertex(newVertex); // add into graph
                            arrayPosition++; // increment counter
                            hashList.put(newVertex, new Integer(arrayPosition));
                            // add position with vertex as key
                        }
                        /*else if (toke1.equals("edge")) {
                            startEdge = toke.nextToken(); // get edge
                            endEdge = toke.nextToken();  // get vertex
                            Integer temp = ((Integer)hashList.get(startEdge));
                            int start = temp.intValue(); // find position with key
                            Integer temp2 = ((Integer)hashList.get(endEdge));
                            int end = temp2.intValue();  // find position with key
                            theGraph.addEdge(start, end); // add edge
                        }*/
                    } // close inner while
                } // close outer while
                System.out.println("The hashtable contents: " + hashList.entrySet());
                br.close();
            } // close try
            catch (IOException e) {
                System.out.println("Whoops, there's a mistake somewhere.");
            }
            theGraph.graphSort();
        } // end main
    } // end class

    I have managed to separate the vertices from the edges and stored them in a hashtable so as to be able to remember their location in the array itself. My question is: is there a way to go through the file a second time and pull the vertices from it, or, conversely, should I store the file's data in, oh I dunno, a linked list or something that I can pull the data from? The graph part of the program works, so I didn't add that file, and setting the vertices works fine too... I'm just stuck on how to handle the edges, exactly, so does anyone have any advice? Right now, the adding of edges is commented out because that was how I handled it with the original data...

    Whoa, you're freakin' me out.
    All you gotta do is read in the data in a new format. You can either translate the new format to the old format, or you can write a method that creates objects from the new format explicitly. That's all there is to it.
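    A minimal sketch of the second approach, reading the new "a b" edge-list format directly, assuming the same Graph class with the addVertex(String) and addEdge(int, int) methods from the listing above:

    import java.io.*;
    import java.util.*;

    public class EdgeListLoader {
        // Reads lines like "a b": registers each vertex the first time it
        // appears, then adds the edge between the two array positions.
        public static void load(File sourceFile, Graph theGraph) throws IOException {
            Map<String, Integer> positions = new HashMap<String, Integer>();
            BufferedReader br = new BufferedReader(new FileReader(sourceFile));
            String line;
            while ((line = br.readLine()) != null) {
                StringTokenizer toke = new StringTokenizer(line);
                if (toke.countTokens() < 2) continue; // skip blank or short lines
                String start = toke.nextToken();
                String end = toke.nextToken();
                if (!positions.containsKey(start)) {
                    positions.put(start, positions.size()); // next free array slot
                    theGraph.addVertex(start);
                }
                if (!positions.containsKey(end)) {
                    positions.put(end, positions.size());
                    theGraph.addVertex(end);
                }
                theGraph.addEdge(positions.get(start), positions.get(end));
            }
            br.close();
        }
    }

    One pass is enough: since every vertex name appears on an edge line, you can register vertices on first sight and add the edge immediately, with no second read of the file and no separate linked list.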

  • Golden Gate Data Transformation and Filters

    Hello All,
    I have a scenario where I need to transform the data on the fly. In short, I have two customers using the same software, and each is customer_id=1 in his own application.
    However, Customer A (customer_id=1) has the other customer as customer_id=55 in his application, and vice versa: Customer B (customer_id=1) has the other customer as customer_id=12.
    I know I can already set up filters for the particular data I want to replicate. However, I don't know if I can transform it before the data is replicated.
    Here is the scenario:
    Replication from Customer A(customer_id=1) to Customer B(customer_id=55)
    ==========================================================
    -->  FILTER inserts, updates, deletes where customer_id=55
    ---> TRANSFORM customer_id=55 to customer_id=1
        --> APPLY EXTRACTS to Customer B
    Replication from Customer B(customer_id=1) to Customer A(customer_id=12)
    ==========================================================
    -->  FILTER inserts, updates, deletes where customer_id=12
    ---> TRANSFORM customer_id=12 to customer_id=1
        --> APPLY EXTRACTS to Customer A
    Thanks in advance for any advice or recommendations you might have on the subject.
    Sincerely
    Jan S.

    I've done some research, and it looks like we are talking about Oracle Data Integrator... I hope not; I don't want to add another layer of complexity.
    I hope there is someone out there who has an alternative solution.
    Thanks in advance for any assistance in this matter.
    jan S.
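    For what it's worth, a plain GoldenGate Replicat MAP statement can express both the filter and the transform without bringing in ODI. A minimal parameter-file sketch, with hypothetical schema and table names (one MAP per direction, each in its own Replicat):

    -- Customer A -> Customer B: keep only B's rows, then rewrite the id
    MAP appa.customers, TARGET appb.customers,
    FILTER (customer_id = 55),
    COLMAP (USEDEFAULTS, customer_id = 1);

    -- Customer B -> Customer A: same idea in the other direction
    MAP appb.customers, TARGET appa.customers,
    FILTER (customer_id = 12),
    COLMAP (USEDEFAULTS, customer_id = 1);

    FILTER drops every row that is not destined for the other side, and COLMAP with USEDEFAULTS copies all columns as-is while overriding customer_id with the literal the target expects.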

  • Webforms - data handling and storage

    Hello,
    I would like to know the recommended best practice / methodology for the following scenario:
    A custom web form with options to enter data, add attachments and submit. On submission, the workflow is triggered, followed by approvals and finally a data update in the relevant transaction.
    My query:
    -> For saving the form data and attachments, the options I am aware of are:
    1) Tables
    2) Case management, into the transaction using GOS
    3) DMS
    4) Workflow container
    5) Business Document Services
    Any more to add to this list would be really good.
    In such a scenario, what is the best possible approach?
    Would be good to hear your thoughts on this.
    Thanks alot
    Saujanya

    Hi,
    In one of my projects we had a similar kind of scenario; below is the approach we tried, and it worked.
    1. Store all the relevant header- and item-level data fields in tables, and maintain the relationship between those two tables.
    2. For attachments, we first created a unique GUID for each attachment and stored the document itself in the DMS system; whenever we needed the document, we referred to the generated GUID.
    3. As usual, when the user clicks Submit, we created a custom business class and raised the event through SAP_WAPI_CREATE_EVENT.
    4. If an attachment is required in the workflow, we referred to it via the tables.
    Maybe this helps you get started... Have a great day.
    Regards
    Pavan

  • 4503/4506 - data handling and data flow

    I have been tasked to find a document that details, from start to finish, how a packet goes through a 4503/4506 unit. I'm talking about a level of detail that includes which portion of RAM the packet goes into once it hits the inbound interface, which parts of the switch handle the analysis (ACLs, et al.), and so on, right up until the packet is either dropped or forwarded to the outgoing interface. As detailed a description as possible; if a non-model-specific equivalent is available and applicable to this unit, that works as well.
    I have been looking through the TechDocs and the like, and have made several attempts at Google (which is well-nigh useless), with no luck thus far.
    Thanks in advance for any information provided.

    I am not aware of any CCO documents explaining the path of a packet through the CAT4500 architecture. However, there was a presentation on this at Networkers 2005. If you attended it, you can check it out at
    http://www.cisco.com/networkers/nw05/nwol.html
    Here is the session information for RST-4500.
    Session Title: Cisco Catalyst 4500 Switch Architecture
    Length: 2 Hours
    Level: Intermediate
    Related Sessions:
    RST-3031 Troubleshooting LAN Protocols
    RST-3042 Troubleshooting Cisco Catalyst 4000 and 4500 Series Switches
    Abstract: This session presents an in-depth study of the architecture of the Cisco Catalyst 4500 Switches and how the various components work together. The focus of this session is to present information that helps the audience understand the architecture well enough to design and implement the Catalyst 4500 in their network and to troubleshoot it better.
    Topics include a discussion of the architecture, information about the latest Cisco Catalyst 4500 supervisors and switching modules such as the Supervisor V-10GE, NetFlow Feature Card (NFL), Catalyst 4948, Supervisor II+TS and PoE linecards, as well as key features such as CEF/Multicast Forwarding, DHCP Snooping, IP Source Guard, Dynamic ARP Inspection, 802.1X, Redundancy (SSO), NetFlow, ACL/TCAM, and QoS (Per-port/Per-VLAN and UBRL).
    This session is designed for network designers and senior network operations engineers who are considering deploying, or already have, Cisco Catalyst 4500 series switches in enterprise and service provider networks.
    * Prerequisites:
    1. Knowledge of LAN protocols is required
    2. Basic understanding of Cisco Catalyst switches is required.
    Speakers: Balaji Sivasubramanian
    Escalation Eng
    Cisco Systems
    Balaji Sivasubramanian is an escalation engineer in Cisco's Gigabit Switching Business Unit. Balaji, who is a CCNP, is also co-author of "CCNP Self-Study: Building Cisco Multilayered Switched Network - 2nd Edition" (ISBN 1587051508). Balaji is an expert in Catalyst 4500 switch architecture and in troubleshooting LAN protocols and Catalyst switches, including the Catalyst 4500, Catalyst 6500 and Catalyst 3500. In his 5+ years with Cisco, Balaji has also held the positions of TAC Technical Leader/Expert in LAN/Campus switching, Worldwide Subject Matter Expert in LAN technologies for the Cisco TAC, and TAC support engineer in LAN/Campus switching.

  • Color Application with Bitmap Data, Threshold and filters

    Hello,
    I want to simulate lipstick over the lips in an image.
    To apply colour to the image we use the threshold method on a BitmapData.
    Then we generate a Bitmap from the image to which we have applied the colour, and we set a BlendMode for the BitmapData.
    We apply the following filters:
    BlurFilter
    BevelFilter
    ColorMatrixFilter
    ConvolutionFilter
    GlowFilter
    GradientBevelFilter
    Then we create the image and assign its .source to the BitmapData with the filters and the specific BlendMode.
    We add this to a canvas with addChild().
    The problem is that the image we obtain is good, but it doesn't look like an image that glows, or like one superimposed on the original.
    It's not very "real". I need it to be shinier.
    Can you help me?
    Thank you very much!

    Just to clarify, the site I'm having trouble with is at the link at the bottom of my first post.  The first one was my working example.  This one's broken: http://www.equestrianarts.org/resources.html
    I'm looking at it in Explorer 7, and all I get is the names of the column headers in {brackets} where there should be lists loading from my html table.  Does anyone else see the same thing?
    Thanks,
    Maria

  • Can I copy meta data from a jpg and past to a tiff. I have CS6 photoshop

    How can I copy metadata from JPG files and paste it onto TIFFs? I send JPGs to a stock agent, and once they make their selection I then send TIFFs, so I want to avoid retyping metadata if possible. Also, some pictures have similar metadata; can I copy that and paste it into another picture?
    Please give me keystroke-level instructions if this is possible, as I get confused easily. Use Photoshop-for-dummies language if you can.
    thanks a lot  Gail

    You can copy & paste individual fields, but not the entire data set. It's a moot point, anyway: PS will retain metadata when a file is saved using Save As... to a format that supports metadata, and JPG and TIFF both do. Also, one would simply use Bridge to edit metadata en masse with multiple images selected...
    Mylenium

  • PPro CS3 fails to read Meta data

    PPro CS3 fails to read metadata in some files I have imported. It fails to read the start of the timecode and the tape name:
    Files imported into PPro CS3: http://web.comhem.se/averdahl/pproCS3_meta.png
    The same files imported into PPro 1.5: http://web.comhem.se/averdahl/ppro15_meta.png
    The first two clips were captured in PPro 2.0, in May 2006, and the rest were captured in PPro CS3, in August 2007. I then used the Split function in Scenalyzer Live to split the captured material into smaller clips, and I added the Tape Name in Scenalyzer Live.
    The metadata (timecode and tape name) can be read in Scenalyzer Live, Adobe Bridge 2.10.100 and PPro 1.5, but PPro CS3 refuses to read some of it. I don't have PPro 2.0 installed right now but will give it a shot later. I have deleted the Media Cache Database and reimported the clips in CS3, to no avail. I can create a new Project and import the files with the same results (some metadata is skipped), so the Project itself is OK.
    Do you know why this happens? Does it happen to anyone else? Does anyone have a solution?
    /Roger

    Make sure that the file has .html or another known file extension so that the server recognizes it as an HTML file.
    The server may automatically send unknown file types as text/plain.

  • JCO - Get meta data - Grey status in JCO connections tab

    Hi all,
    I have read many posts on this topic in this forum, but I didn't succeed in getting the required metadata or in setting up my connection correctly.
    Let me quickly give you my configuration.
    In the j2ee admin
    -> Services > SLD Data Supplier > CIM Client Generation Settings
    I set the fields to SLD7000 / 50000 and my logon info.
    Save and test -> the result is OK!
    In my Web Dynpro application, I am able to get the correct BAPI from the BCE system using my logon info, and I map my view to display data from the called BAPI.
    So at this point, in my IDE, the connection to the SAP backend system works, because I can get the list of BAPIs from the BCE SAP backend system as well.
    I build and deploy my application.
    Now, let's go to the Web Dynpro welcome page on my j2ee server: http://<host>:<port>/webdynpro/welcome
    -> content administration
    -> I select my application from the left tree browser and open the "JCO Connections" tab. I can see the connections I previously set in my Web Dynpro application, but I CAN'T use the "Create" button to configure a new JCO connection. ??
    Last thing:
    in the BCE backend system
    transaction : RZ70
    I have host = sldmain and service = sapgw47.
    Also, in the list of data collection programs, everything is checked as active except the program "_SLD_RFC".
    I don't know if that is important...
    Finally, here are only the "Caused by" lines from the exception I got in my browser:
    Caused by: com.sap.dictionary.runtime.DdException: TypeBroker failed to access SLD: Error while obtaining JCO connection.
    Caused by: com.sap.tc.webdynpro.services.sal.sl.api.WDSystemLandscapeException: Error while obtaining JCO connection.
    Caused by: com.sap.tc.webdynpro.services.exceptions.WDRuntimeException: Failed to create J2EE cluster node in SLD for 'J2E.SystemHome.localhost': com.sap.lcr.api.cimclient.LcrException: CIM_ERR_NOT_FOUND: No such instance: SAP_J2EEEngineCluster.CreationClassName="SAP_J2EEEngineCluster",Name="J2E.SystemHome.localhost"
    Thanks a lot for your support and help,
    Best regards,
    Emmanuel.

    Hi Emmanuel
    You said
    " I select from the left tree browser my application, and open the "JCO Connections" Tab. I can see my connections previously I set in my webdynpro application. But I CAN'T use the button "Create" to configure new JCO connection. ??"
    What is their status? Is it green or red? Usually Create is greyed out if it is green. What about Edit?
    Try creating a new Web Dynpro component with another destination name, deploy it, and then check in the Content Administrator whether these new destinations have Create enabled.
    After running RZ70, did you check the Technical System in SLD administration to see if the technical details have been propagated?
    Regards
    Pran

  • 844,845 & 849 seeburger meta data &maps

    Hi,
    We have the Seeburger EDI adapter, but we need to implement interfaces using transaction sets 844, 845 and 849, and we don't see any metadata info or maps for those in the software we have downloaded.
    How can we get them? Do we need to request them from Seeburger or download them from the Service Marketplace?
    thank you,
    Sathish

    Hi
    Follow these links; they will help you:
    SEEBURGER EDI adapter
    What are the seeburger adapters needed?
    XI Seeburger adapters
    Seeburger Adapter - Mappings
    Regards,
    Santosh

  • Meta data for streaming mp3s?

    Hello
    I'm using Flash Media Streaming Server, and I read that I cannot obtain metadata from MP3s using FMS.
    Is this true?
    I want to be able to seek/scrub audio tracks, but I would
    probably need the duration of the mp3.
    If anyone can shed some light on this that would be great!
    Thanks
    Ryan

    Many nodes of faces-config have an xxxx-extension. Some of them will be added in JSF 1.2. This is a place where the metadata is stored. JSR-276 ( http://jcp.org/en/jsr/detail?id=276 ) is dedicated to design-time metadata standardization. The initial proposal is here: http://www.jsfcentral.com/reading/index.html#Proposals
    The JSR has only just been created, so it is about the future. Currently, each tool has its own metadata format and its own way of presenting custom components and making them work inside the tool.
    Sergey : http://jsftutorials.net

  • Photo Meta data stripped out

    Other than a small savings in file size, why would iWeb strip all of a photo's metadata (keywords, © info, owner name, description, even the filename itself, etc.) out of photos added to iWeb?
    Is there any way around this? I've tried drag and drop from a folder, with the same results.
    Considering Apple makes its living on copyrights, why do they make it so difficult for the rest of us to reserve ours?
    Otherwise the new iWeb rocks!

    You must be talking about non-standardized metadata, as in iPhoto metadata or some other app's???
    Because I geo-tagged my photos with standardized metadata (EXIF), and iWeb does not strip it off.
    I can see iWeb resize the photos, but the EXIF metadata is intact; I can build full Google Earth and Google Maps files from the iWeb photo page's rss.xml file.
    This Google Map is based on an iWeb photos page: http://home.cyclosaurus.com/AlumRock_Park_GoogleMaps.html
    and this is the photos page; the photos are geotagged with standard EXIF:
    http://www.cyclosaurus.com/Home/PhotoAlbums/Pages/Alum_RockPark.html
    You can download my photos and you will see the EXIF metadata is there.

  • No image preview in Bridge and unable to write meta data files

    I recently converted my CR2 files to DNG files using the DNG Converter. For the most part it worked. However, for some files something went wrong: when I looked at a folder of newly converted DNG files in Bridge, I noticed that some files didn't have a preview. These same files won't allow metadata to be written, either. Are these files corrupt? What is wrong, and how do I fix it? I tried to open these files in CR and they do open. The weird thing is that if I open a file in CR, then click save file as DNG, it will then show a preview in Bridge.

    Yes. The original RAW file (CR2) allows me to write metadata. In fact, if I write the metadata to the CR2 file first, then convert to DNG, all the metadata is there - even for the files that at first do not have a preview in Bridge.
    But the reason I'm converting to DNG is that I want my original archive files to be in a non-proprietary format.
    Even though I can get around the current issues of no preview and writing metadata, my main concern is whether or not there is a problem with the conversion. I don't want to find out later that the file is corrupt. I'm worried that the issues with previewing and writing metadata are symptoms of some bigger problem.
    Any ideas?
