CC&B and Meter Data Management Integration Usage Requests

Hello.
We are investigating the following issue: we are trying to create a Bill in CC&B that obtains its usage data from Meter Data Management via the Usage Request engine. The CC&B help pages describe the big picture of Usage Requests and give a configuration overview, but that is not enough for a correct and complete setup; in particular, there is no information about the formats of the outbound and inbound XML messages.
The CC&B help links to the document Oracle Utilities Customer Care and Billing - Meter Data Management Integration Implementation Guide, but our attempts to find this document have been unsuccessful.
We would appreciate any information about Usage Requests from CC&B to Meter Data Management, or a link to the document mentioned above.
Thanks in advance!
P.S. Our current Usage Request setup produces the log below during bill segment generation (we are using JMS queues and XSL transformation of the outbound and inbound messages, so MDM does send a response, but something goes wrong in further processing on the CC&B side). As a result we get a bill segment in Error state with the remark "Usage Request for Bill Segment is not found." (A debugging sketch follows the log fragment below.)
Fragment from weblogic_current.log:
SYSUSER - 318312-3794-1 2011-01-19 14:40:55,547 [Remote JVM:1 Thread 3] WARN (host.sql.CobolSQLParamMetaData) COBOL set the null indicator to false for SQL bind parameter xWIN-START-DT, but it sent a null value ' '; binding null
SYSUSER - 318312-3794-1 2011-01-19 14:40:55,554 [Remote JVM:1 Thread 3] INFO (COBOL.CMPBBLLP) Invoking CMPBBILG
SYSUSER - 318312-3794-1 2011-01-19 14:40:55,554 [Remote JVM:1 Thread 3] INFO (COBOL.CMPBBILG) CMPBBILG validate
SYSUSER - 318312-3794-1 2011-01-19 14:40:55,572 [Remote JVM:1 Thread 3] INFO (COBOL.CMPBBILG) Get next Acct sa
SYSUSER - 318312-3794-1 2011-01-19 14:40:55,579 [Remote JVM:1 Thread 3] INFO (COBOL.CMPBBILG) Acct Sa more
SYSUSER - 318312-3794-1 2011-01-19 14:40:57,394 [Parent Reader:Thread-54] INFO (api.maintenanceObject.MaintenanceObjectLogHelper) (Server Message)
     Category: 11002
     Number: 12150
     Call Sequence:
     Program Name: Usage_CHandler
     Text: Transitioned from Pending to Awaiting Data Sync.
     Description:
     Table: null
     Field: null
SYSUSER - 318312-3794-1 2011-01-19 14:40:58,089 [Parent Reader:Thread-54] INFO (api.maintenanceObject.MaintenanceObjectLogHelper) (Server Message)
     Category: 11002
     Number: 12150
     Call Sequence:
     Program Name: Usage_CHandler
     Text: Transitioned from Awaiting Data Sync to Send Request.
     Description:
     Table: null
     Field: null
SYSUSER - 318312-3794-1 2011-01-19 14:41:03,720 [Parent Reader:Thread-54] INFO (support.context.CacheManager) Registering cache 'XAIOptionCache'
SYSUSER - 318312-3794-1 2011-01-19 14:41:06,920 [Parent Reader:Thread-54] INFO (domain.integration.RealtimeOutboundMessage) sending Realtime Outbound message <?xml version="1.0" encoding="UTF-8"?><SOAP-ENV:Envelope xmlns:SOAP-ENV="urn:schemas-xmlsoap-org:envelope"><SOAP-ENV:Body><CM_UsageCalculation transactionType="READ" dateTimeTagFormat="CdxDateTime"><rate>ERES-1</rate><saId>123456789</saId><usageId>954617571747</usageId><scalarProcessing><billingOption>Y</billingOption><startDateTime>2010-06-09-12.00.00</startDateTime><endDateTime>2010-06-10-12.00.00</endDateTime></scalarProcessing></CM_UsageCalculation></SOAP-ENV:Body></SOAP-ENV:Envelope>
SYSUSER - 318312-3794-1 2011-01-19 14:41:22,493 [Parent Reader:Thread-54] INFO (domain.integration.RealtimeOutboundMessage) Raw Response from External System is ID:325496.1295437282326.0
SYSUSER - 318312-3794-1 2011-01-19 14:41:23,966 [Parent Reader:Thread-54] INFO (support.schema.BusinessObjectInfo) BusinessObject C1-NonCyclicalUsgReqOutMsg: Skipping audit calls while performing DEL on entity OutboundMessage_Id(899647019991), as there were no auditable changes
SYSUSER - 318312-3794-1 2011-01-19 14:41:24,655 [Parent Reader:Thread-54] INFO (api.maintenanceObject.MaintenanceObjectLogHelper) (Server Message)
     Category: 11002
     Number: 12150
     Call Sequence:
     Program Name: Usage_CHandler
     Text: Transitioned from Send Request to Awaiting Bill Determinants.
     Description:
     Table: null
     Field: null
SYSUSER - 318312-3794-1 2011-01-19 14:41:24,939 [Parent Reader:Thread-54] INFO (api.maintenanceObject.MaintenanceObjectLogHelper) (Server Message)
     Category: 11002
     Number: 12150
     Call Sequence:
     Program Name: Usage_CHandler
     Text: Transitioned from Awaiting Bill Determinants to Bill Determinants Received.
     Description:
     Table: null
     Field: null
SYSUSER - 318312-3794-1 2011-01-19 14:41:25,892 [Remote JVM:1 Thread 3] WARN (host.sql.CobolSQLParamMetaData) COBOL set the null indicator to false for SQL bind parameter xWIN-START-DT, but it sent a null value ' '; binding null
SYSUSER - 318312-3794-1 2011-01-19 14:41:25,932 [Remote JVM:1 Thread 3] INFO (COBOL.CMPBBILG) Get next Acct sa
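
Since MDM clearly returns a response (the raw message ID is logged above) yet the bill segment still errors with "Usage Request for Bill Segment is not found", one thing worth checking is what the inbound XSL actually produces: if the transformed response no longer carries the identifiers CC&B needs to match it back to the usage request (for example the usageId 954617571747 visible in the outbound message), the response may never get linked to the request. Below is a minimal sketch for running the inbound stylesheet over a captured MDM response outside of CC&B, using the standard javax.xml.transform API; the file names are placeholders for your own artifacts.

import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class InboundXslCheck {
    public static void main(String[] args) throws Exception {
        // inbound.xsl      : the XSL configured for the inbound JMS message (placeholder name)
        // mdm-response.xml : the raw response captured from the queue (placeholder name)
        Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource("inbound.xsl"));
        // Print the transformed payload so it can be compared with the message
        // format the usage request expects on the inbound side.
        t.transform(new StreamSource("mdm-response.xml"), new StreamResult(System.out));
    }
}

If the transformed document looks right, the problem is more likely in the XAI inbound service or response mapping than in the XSL itself.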

Did you find the document Oracle Utilities Customer Care and Billing - Meter Data Management Integration Implementation Guide?
I searched eDelivery and Oracle Support, but could not find the document.

Similar Messages

  • Oracle Meter Data Management Integration with SAP ISU

    Hello Everyone,
    I am aware that Oracle has released a media pack for this integration, as discussed here, and that the media pack works only for SAP ERP 6.0 EHP 5 with ISU_AMI_2.
    We have OU MDM Release 2.1 Media Pack v4 and SAP ERP 6.0 EHP 4 with ISU_AMI_1, so the adapter in MDM won't help us achieve integration with SAP, as our SAP EHP is lower than what the MDM adapter requires. We do have SAP PI, which would help on the SAP end of the integration.
    What are the possible workaround(s) to achieve the integration between Oracle MDM and SAP ISU without using the media pack/adapter?
    How does MDM expose its services (inbound/outbound) to be consumed/triggered by external systems (i.e. SAP ISU)? (A rough JMS sketch follows below.)
    Is there a list of services (inbound/outbound) exposed by Oracle MDM for integration?
    Your time and help is much appreciated.
    Regards,
    Adil Khalil

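    One integration style that does not depend on the media pack, and that the CC&B-MDM Usage Request setup above also relies on, is exchanging XAI outbound/inbound messages over JMS queues; SAP PI could then pick those messages up with its JMS adapter. Below is a rough sketch of a plain JMS consumer for such a queue, assuming WebLogic as the JMS provider. The JNDI names, queue name and provider URL are made-up placeholders, not names from an actual MDM install.
    import java.util.Hashtable;
    import javax.jms.Connection;
    import javax.jms.ConnectionFactory;
    import javax.jms.MessageConsumer;
    import javax.jms.Queue;
    import javax.jms.Session;
    import javax.jms.TextMessage;
    import javax.naming.Context;
    import javax.naming.InitialContext;

    public class MdmOutboundConsumer {
        public static void main(String[] args) throws Exception {
            // JNDI settings are placeholders; point them at the JMS server MDM publishes to.
            Hashtable<String, String> env = new Hashtable<String, String>();
            env.put(Context.INITIAL_CONTEXT_FACTORY, "weblogic.jndi.WLInitialContextFactory");
            env.put(Context.PROVIDER_URL, "t3://mdm-host:7001");
            InitialContext ctx = new InitialContext(env);

            ConnectionFactory cf = (ConnectionFactory) ctx.lookup("jms/MdmConnectionFactory"); // assumed name
            Queue queue = (Queue) ctx.lookup("jms/MdmOutboundQueue");                          // assumed name

            Connection con = cf.createConnection();
            try {
                con.start();
                Session session = con.createSession(false, Session.AUTO_ACKNOWLEDGE);
                MessageConsumer consumer = session.createConsumer(queue);
                TextMessage msg = (TextMessage) consumer.receive(30000); // wait up to 30 seconds
                System.out.println(msg == null ? "no message received" : msg.getText());
            } finally {
                con.close();
            }
        }
    }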

  • Integration Oracle Utilities CC&B w/ non-Oracle bespoke Meter Data Management system

    Hello all,
    In your opinion, what's the best way to integrate a bespoke (non-Oracle) Meter Data Management system with Oracle CC&B 2.3.1 or 2.4?
    I came up with a few possible scenarios and would love to hear your opinion:
    1. Use CC&B Meter Read upload (MUP) batch process to upload reads from our bespoke Meter Data system.
    2. The bespoke Meter Data Management system adds Meter Reads to CC&B using a CC&B Inbound Service (see the sketch after this post).
    3. At Billing time, CC&B invokes a Get Consumption Algorithm (invoked before the Bill Segment Creation Algorithm) which is responsible for requesting the reads from our bespoke Meter Data Management system.
    This is the approach used by the Oracle CC&B-Oracle MDM integration when it comes to Usage requests/transactions.
    4. Any other idea? Feel free to share.
    Thanks

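    For option 2, here is a minimal sketch of what the push from the bespoke system could look like: an HTTP POST of a SOAP envelope to CC&B's XAI servlet. The endpoint path (/XAIApp/xaiserver), host, port and the service name CM-MeterReadAdd are assumptions for illustration; use the XAI Inbound Service defined in your own environment and let its schema drive the element names.
    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class XaiMeterReadPost {
        public static void main(String[] args) throws Exception {
            // CM-MeterReadAdd is a hypothetical inbound service; the element names are illustrative only.
            String soap =
                "<SOAP-ENV:Envelope xmlns:SOAP-ENV=\"urn:schemas-xmlsoap-org:envelope\">"
              + "<SOAP-ENV:Body><CM-MeterReadAdd transactionType=\"ADD\">"
              + "<meterId>1234567890</meterId>"
              + "<readDateTime>2011-01-19-14.40.00</readDateTime>"
              + "<registerReading>42.5</registerReading>"
              + "</CM-MeterReadAdd></SOAP-ENV:Body></SOAP-ENV:Envelope>";

            URL url = new URL("http://ccb-host:6500/XAIApp/xaiserver"); // host and port are placeholders
            HttpURLConnection con = (HttpURLConnection) url.openConnection();
            con.setRequestMethod("POST");
            con.setRequestProperty("Content-Type", "text/xml; charset=UTF-8");
            con.setDoOutput(true);
            OutputStream os = con.getOutputStream();
            os.write(soap.getBytes("UTF-8"));
            os.close();
            System.out.println("HTTP " + con.getResponseCode());
        }
    }
    Whether option 2 or option 3 fits better mostly comes down to whether reads should be pushed ahead of billing or pulled at billing time.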

  • Script to Stop and Start Oracle Meter Data Management

    Hi,
    This is my first post in the Oracle forums, so if I have opened this in the wrong place please forgive me.
    I tried to create a Windows batch script to stop and start Oracle Meter Data Management, but I can't get it to work:
    @echo off
    D:\spl\MDMDEV\bin\splenviron.cmd -e MDMDEV
    D:\spl\MDMDEV\bin\spl.cmd stop
    D:\spl\MDMDEV\bin\splenviron.cmd -e MDMDEV
    D:\spl\MDMDEV\bin\spl.cmd stop
    The batch exits at the first command and I can't understand why.
    Another thing: if I use the spl.cmd stop command it doesn't do anything; is this normal?
    Thanks in advance for all replies.
    Best Regards.
    Nuno

    I had to put a call command in mine. I think it is because you are trying to run a .cmd file within your bat file. You're actually changing shells. That's why you need the call.
    call d:\spl\mdmdev\bin\splenviron.cmd -e mdmdev
    call d:\spl\mdmdev\bin\spl.cmd start
    I actually set mine up to use variables so all I had to change from script to script was the environment...
    <snip>
    SET SPLENV=MDMDEV
    SET INSTALLDRIVE=D:
    SET INSTALLDIR=SPL
    SET BINDIR=%INSTALLDRIVE%\%INSTALLDIR%\%SPLENV%\bin
    %INSTALLDRIVE%
    cd %BINDIR%
    REM Set environment
    call %BINDIR%\splenviron.cmd -e %SPLENV%
    REM Wait a few seconds for environment to setup
    ping localhost -n 5 > nul
    REM Start SPL Web
    call %BINDIR%\spl.cmd start
    </snip>
    I was unable to find a way to get this to run as a service. Have you?
    I'm currently running these scripts in the startup/shutdown script section of the local machine policy (gpedit.msc).

  • What is data archiving and DMS(Data Management System) in SAP

    what is data archiving and DMS(Data Management System) in SAP
    Welcome to SCN. Before posting questions please search for available information here and in the web. Please also read the Rules of Engagement before further posting.

    Hi,
    Filtering at the IDoc Level
    Identify the filter object (BD59)
    Modify the distribution model
    Segment Filtering
    specify the segments to be filtered (BD56)
    The Reduced IDoc Type
    Analyze the data.
    Reduce the IDoc type (BD53)
    Thanks and regards.

  • EBS data and meta data retrieval

    Hi,
    I am into Java, and my requirement is to retrieve data and metadata of the underlying EBS implementation and make it available to the EBS end users for some kind of processing outside EBS. With my limited knowledge of EBS, what I understand is that there is a logical view and a physical view of the data. My queries are as below:
    1. Is there a way to get the relationship between physical and logical view?
    2. How to get the meta data of the underlying table structures?
    3. Should JDBC drivers be sufficient to retrieve the data or is there any other more efficient means of retrieval.
    Any help on the same is appreciated as I am new into this area and any help is welcome.
    Thanks
    MNG

    Hi,
    1. Is there a way to get the relationship between physical and logical view?
    2. How to get the metadata of the underlying table structures?
    See the Oracle eTRM website:
    https://etrm.oracle.com
    3. Should JDBC drivers be sufficient to retrieve the data or is there any other more efficient means of retrieval?
    I believe using JDBC drivers should be sufficient (a small sketch follows this reply). For using JDeveloper, please see (Note: 330236.1 - Configuring JDeveloper For Use With Oracle Applications 11i and R12).
    Regards,
    Hussein
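    To illustrate the plain JDBC route, here is a small sketch that reads table metadata from the Oracle data dictionary with the thin driver. The connection string, credentials and the example table (AP.AP_INVOICES_ALL) are placeholders; eTRM remains the place to map logical entities to the physical tables.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class EbsTableMetadata {
        public static void main(String[] args) throws Exception {
            // Connection details are placeholders; use a read-only account with access to the APPS schemas.
            Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@ebs-host:1521:PROD", "apps_ro", "password");
            try {
                PreparedStatement ps = con.prepareStatement(
                        "SELECT column_name, data_type, data_length "
                      + "FROM all_tab_columns WHERE owner = ? AND table_name = ? "
                      + "ORDER BY column_id");
                ps.setString(1, "AP");
                ps.setString(2, "AP_INVOICES_ALL");
                ResultSet rs = ps.executeQuery();
                while (rs.next()) {
                    System.out.println(rs.getString("COLUMN_NAME") + "  "
                            + rs.getString("DATA_TYPE") + "(" + rs.getInt("DATA_LENGTH") + ")");
                }
            } finally {
                con.close();
            }
        }
    }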

  • Point of sales (POS) and Product data management (PDM)

    Hi guys,
    Please can anyone tell me what Point of Sale (POS) and Product Data Management (PDM) are in the SD module, and point me to the structures for them?

    As far as I know it is not supported.
    Oracle supports counter sales orders, but where performance is a consideration, invoicing and receipting are slow in Oracle, so this may need a workaround...

  • Used Relocate Masters to new g-drive. Lost Aperature Library and meta data

    How I crashed Aperture:
    Aperture 1.56
    all updates current
    Tiger 10.4.11
    all updates current
    The goal was to move the Aperture Library from the MBP to a g-drive.
    What I did:
    1. "Relocate Masters" to the g-drive. Success. Aperture still viewed all files with all metadata normally.
    2. Collapsed "Library" in the Projects panel. Did not delete Projects. Did not delete any photos.
    3. Preferences > Image Management > set the new Aperture Library on the g-drive.
    4. Uploaded new photos from a Canon and viewed them successfully.
    5. Ejected the g-drive correctly.
    What Aperture is not doing now:
    1. No Projects show up.
    2. No Referenced Photos show up.
    3. No Vaults show up. Note: the Vault was originally on the MBP, but I foolishly did not set one up on the g-drive before moving. I also have a backup Vault on a different g-drive that is, unfortunately, not up to date with all the current photos from before the move.
    4. The Aperture Library on the g-drive is there, but I cannot open any files in it. [note: item 3 above]
    5. Image files are listed on the g-drive but won't open in Aperture, though they do open in Preview or Gimp, etc.
    6. Aperture does not show any Vaults, nor does it access Referenced Files.
    Can you help?
    I guess these would be the questions:
    1. How can I get the pics from the g-drive back into an Aperture Library with all the previous metadata attached to each Version?
    2. How do I get the Aperture Library I put on the g-drive to open in Aperture?
    3. If #1 is impossible, I guess I set up a Vault on the g-drive and move what I have from the backup Vault mentioned above.
    Thanks for any advice!

    If I am not mistaken, I may have read through your PDFs quite a while ago. I am still trying to digest them and then apply them to my needs. If I had known that small projects may be faster and easier to manage than larger ones, I would have done my imports quite differently. I chose to use Aperture in the first place to organize and add information to photos. I archive old photos, in particular from the 350th Fighter Group, WWII in the Mediterranean. So one "Project" of mine is +/-5000 images in the form of TIFF scans. I imported all of these into 1 project. Then I started applying metadata and keywords like crazy. This allowed me to start to sort the photos into categories and find multiple versions of the same image. I started making stacks of all the multiples. Then with smart albums and smart websites I was able to create what might now be smaller projects. Since I gather information from others by sharing these photos on a website, you are able to see the smart websites here.
    http://web.me.com/vizcarraguitars/350thFighter_Group_Blog/350th_FG_Blog/350th_FGBlog.html
    One cool thing is that various images reside in multiple smart webs based on a common keyword. But now that I have all this organization, it seems it is time to put all my originals in a similar order via Aperture.
    By the way, my files are referenced files unless I make the mistake of importing them without leaving them in their original location.
    I am thinking of starting over with blue folders. But I can't seem to get my head around it.

  • Workflow for avchd clips and meta data

    I just started using a Panasonic AG-AF100 on a 5-month project. We will be producing thousands of clips. I have off-loaded the first 3 cards we shot and have discovered something about AVCCAM material: if you just copy the .MTS files from their /stream folder, or rename a file with a name instead of its original number produced in the camera, you lose the timecode and other metadata.
    If I leave the clips in that folder, and the other card folders in their associated directories, and "import" the clips with the Premiere Media Browser, the timecode is retained. The trouble is that I cannot see thumbnails in Media Browser, and the clip number means little. Bridge on my Win7/64-bit machine shows thumbnails (but no preview), and a double-click will preview in Windows Media Player.
    On my main edit station (Vista/64-bit) Bridge shows neither thumbnails nor previews, and I get no AVCHD .mts playback in Windows Media Player. Everything works the same within Premiere: nice solid editing and playback of AVCHD clips. (I will probably upgrade this machine to Win7 next week.)
    So my requests for advice/questions are:
    Suggestions for managing and organizing large quantities of clips with what will be similar names, etc.
    Previewing updates or options to find media
    How do you manage AVCHD material and still maintain metadata?
    Thanks so much!! As I get older and the technology seems faster, stronger, sharper, the demands on my brain get tougher.
    Rick

    Rick,
    If you just copy the .MTS files from their /stream folder or rename the file with a name instead of its original number produced in the camera you lose the timecode and other meta data. 
    If I leave the clips in that folder and other card folders in their associated directories and "import" clip with Premiere Media Browser it retains the timecode.
    You must, must, must keep the files in their folder structure if you intend to maintain timecode. You've clearly discovered this. Good you did so early. The folder structure thing is really not a big deal; there is very little need to manipulate these at the OS level, and the Media Browser in PPro makes it reasonably easy to bring these in for editing. You can pre-build a folder structure into which you can offload each card's worth of clips.
    Trouble is that I cannot see thumbnails in Media Browser and the clip number means little.  Bridge on my Win7/64bit machine shows thumbnails (but no preview) and a double-click will preview in Win Media Player. 
    On my main edit station (Vista/64bit) Bridge neither shows thumbnails or previews and I get no avchd .mts playback in Windows Media Player.  Everything works the same within Premiere, nice solid editing and playback of avchd clips. (I will probably upgrade this machine to Win7 next week)
    Windows 7 has built-in splitters and decoders for H.264 media, so that's the reason you can at least get thumbnails in Bridge and playback the files in WMP there. The Media Browser, even on Windows 7, won't show thumbnails as they are not saved into the folder structure the way that P2 saves a poster image into the folder structure. Note that even "standard" containers like AVI or MOV do not display a thumbnail in Media Browser.
    As far as Bridge is concerned: I know a lot of people like it, but from my perspective, it's just about worthless for video. This is largely an application designed for photography workflows that has been shoehorned into "working" with video. I don't even like it for AVI/MOV.
    The ugly truth is that Bridge is not going to be much, if any, help in this workflow. However, there may be another tool in the suite that will accomplish what you need: OnLocation. You won't get thumbnails, but you can browse through the folder structure without knowing you're doing it, import clips to preview them, and add metadata. You could do this for each card you offload. Then, in Premiere, simply use Media Browser to navigate to your OnLocation Project, and it will open up just like a folder, listing all your clips with the metadata you added. Granted, it is not a perfect workflow, but once you added that metadata, it will flow with the files wherever you move them around in the Adobe suite--it won't go elsewhere, as it's not injected into the files themselves. So long as you're editing in Premiere, you're good.
    Now, there is another thing you can do that may make clip management even easier, though it does require a bit of forethought in pre-production. It will really depend on the type of project you're shooting, but you can preload metadata on to an SD card and into the camera, and then automatically record this to the clips as they are shot! I don't have an AVCCAM to give you exact workflow steps, but I do this all the time with my P2 workflow and it's a tremendous help once you hit the edit bay.
    If you haven't already, head over to the Panasonic support site: Support Desk Top / Broadcast and Professional AV. You'll need to register for a "PASS" membership (it's free, and you don't need to register a camera), but you can then download the AVCCAM Viewer application. Unfortunately, this application is not quite as helpful as P2 Viewer is (it's rather cartoony, actually), but it does have the metadata editor in it that will let you create your metadata files in advance and save them to an SD card. Then, after consulting the camera manual, you can set those metadata files to be a data source for the footage as you record it. Here's what you can add to the metadata files and, summarily, the clips:
    Once you load this into the camera's memory, that data is saved into the clips, and is readable by Premiere. The "User Clip Name" field can be set to auto-increment (again, this is going off my knowledge of the way metadata works in a P2 camera, but the AVCCAM should be similar) so that each clip has a base name and then a serial number. When you're done with a particular scene/shot, you'd load the next metadata files, reset the indexing, and continue. Lather, rinse, repeat.
    The biggest hole in the metadata workflow, at least as it pertains to AVCHD/AVCCAM, is that there is no tool (that I'm aware of) that lets you edit this metadata after the fact. With P2, you can--there are a number of applications, both free and payware, that let you do this--but I've yet to find one that works with AVCCAM. Even Panasonic's own AVCCAM Viewer is incapable of this, it seems.
    The major advantage to adding file metadata like this is that it travels wherever the clips go, and into whatever application they land in. Not being able to edit it after the fact, though, just plain sucks. This is where the clip metadata you'd add in the Adobe applications comes into play, with the obvious disadvantage that it's only useful in Adobeland. As always, ain't no such thing as a free lunch.
    So that's my take on this. It's good you're looking into this before you have a couple of terabytes worth of footage on hand. Let me know if I can help fill in any details that I might have overlooked. And be sure to check the Panasonic site linked above for firmware updates to the AF100; I see one came out just a few days ago. It's always good to stay current.
    Good luck!

  • Programs crash and meta-data is lost.

    Frustrating problem with this otherwise fabulous device. Should I return this device, or is the consensus that these are software bugs that will likely be fixed over time in firmware upgrades?
    When using my ipod-touch (32GB 1.4 with software update) there will occur inevitably after a certain length of time a random crash in Safari or the music player (the screen turns black and goes back to the main screen). Any meta-data (like playcount, last played etc) that should have been collected on the songs that I have played up to that point is wiped out so that the only meta-data updates saved back to the computer (mac-mini OSX 10.4; Latest itunes) are those that occur subsequent to the crash.
    After one of these crashes, all of the media seemed to disappear from the device. It showed 25GB used and 5GB free but 0 songs, 0 videos and the device showed all the space taken over by Other. The ipod refused to sync saying there was not enough space. A restore fixed this and it has not happened again but either the music player or safari (whether or not music is playing in the background) will crash within an few hours of usage.
    I don't so much mind the crashes, at least the minor ones, but the metadata loss is frustrating, as I rely on it for smart playlists etc.

    I would go to an Apple Store and have them trade it in for a new one. You should still have a backup of the other iPod that you can restore the new one with. Getting a new one should fix the problem, and catching bugs earlier rather than too late, when you no longer have the warranty, is a good idea.
    Hope I helped.
    -Messymeese

  • Catalog settings and meta data is there a better way.

    Hi,
    I have Adobe CS4 and Lightroom 2.2. I do my editing in LR (cropping, exposure, web galleries and so on); that's all well and good, but any modifications I make will not show up in Bridge unless I go into Lightroom and apply "save metadata to files". I have about 400 Lightroom catalogs, as I store the catalog in the folder that contains the original images. Is there a way to have Lightroom automatically apply metadata changes in each catalog instead of going into the catalog settings or saving the metadata to the file? It's a pain to have to go into a catalog and apply all those settings if I want to do some editing in Bridge.

    > ...is there a better way ...
    Might I humbly suggest that your multiple (400!!) catalog strategy is flawed? You have defeated the benefits of having a single database. (And those that argue a database is bad concept and a possible single point of failure are ... well ... simply wrong.)
    Lr can automatically write XMP data so that Bridge can read it. But the reverse is not true. Any changes that Bridge/ACR make will have to be imported into Lr with manual initiation.
    The point and beauty of Lr is that you make the majority of adjustments in Lr. It's possible to make adjustments in Bridge and have them imported into Lr but as you discovered it is not automatic. And your workflow choice makes it doubly (or 400 times) more troublesome.

  • Problems with Importing / Exporting Keywords and Meta Datas

    Hi to all!
    I recently upgraded to Aperture 3 and upgraded my referenced library.
    Today I opened the Keyword HUD and noticed some keywords scattered in my list, which seem to be older ones, since the numbering indicated that they are not applied to any pictures.
    So I deleted them.
    Then I noticed the <Imported Keywords> folder, opened it, and it also contained a large number of previous keywords. They also seemed not to be in use, so I removed them as well.
    Then I locked the Keyword HUD.
    Now my question: if I export a version with the 'include metadata' option ticked, edit the version in Photoshop, and afterwards import it back into the Aperture library, I have the problem.
    I have tried 'Import Meta Datas' and clicked 'Append'. It then recognizes the former keywords, which I would appreciate, but not as 'Imported Keywords'.
    If I opened the file with the External Editor and returned it, I guess I would not have this problem. Usually I open the referenced RAW file and then import the edited version back into Aperture. Keywords are stored in the Library, so I would not get the previously assigned keywords then, is that right?
    By the way, are keywords part of the metadata or not?
    Are there any workarounds?
    And had other people also problems with their keyword list after upgrading?
    Thank for any ideas / infos!
    Michael

    Added header to CSV and to Code
    $ImportFile = Import-Csv "C:\Users\username\Desktop\Scripts\Powershell\Epic\SCCM CI\Tags.csv" -Header Computer
    foreach ($Computer in $ImportFile) {
        $path = "\\$Computer\c$\Epic\bin\7.9.2\Epic Print Service"
        $xml = Select-Xml -Path "$path\EpicPullService.config.xml" -XPath //EpicPullService//Cleanup | Select -ExpandProperty Node
        if ($xml.ArchiveHours -eq '12' -and $xml.DeleteHours -eq '120') {
            $Compliance = $True
        } Else {
            $Compliance = $False
        }
        "$Computer","$Compliance" | Export-Csv "C:\Users\username\Desktop\Scripts\Powershell\Epic\SCCM CI\Results.csv"
    }
    Results:
    select-xml : Cannot find path '\\@{Computer=SW1412-16985}\c$\Epic\bin\7.9.2\Epic Print Service\EpicPullService.config.xml' because it does not exist.
    At C:\Users\username\Desktop\Scripts\Powershell\Epic\SCCM CI\Check_PullServiceXML.ps1:4 char:8
    + $xml = select-xml -path "$path\EpicPullService.config.xml" -xpath //EpicPullServ ...
    + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
        + CategoryInfo          : ObjectNotFound: (\\@{Computer=SW...vice.config.xml:String) [Select-Xml], ItemNotFoundException
        + FullyQualifiedErrorId : PathNotFound,Microsoft.PowerShell.Commands.SelectXmlCommand
    If it is not a CSV file then just get it with Get-Content
    Get-Content "C:\Users\UserName\Desktop\Scripts\Powershell\Epic\SCCM CI\Tags.csv" |
        ForEach-Object{
            $computer = $_
            $path = "\\$computer\c$\Epic\bin\7.9.2\Epic Print Service\EpicPullService.config.xml"
            # ...continue with the Select-Xml compliance check as above
        }
    ¯\_(ツ)_/¯

  • Business meta data management

    Hello,
    I could not find a way to capture business-related metadata inside OWB, or maybe I am missing something. I need to capture business rules, validation, definitions etc. inside OWB. Could someone point me in the right direction and to the steps?
    Thanks
    Syed

    Hi Syed,
    I think the way to model business rules today is very much tied to the PL/SQL implementation; however, it is quite possible to add special fields in your repository that allow you to link to documents, URLs and other things. The same goes for the browser, where you can link to pages outside of the OWB content.
    To create a user-defined property you can take a look at:
    http://otn.oracle.com/products/warehouse/sdk/Scripting%20Pages/Scripting_5_meta_meta.htm
    It describes how you can create a user-defined property where you can add the links or document links I mentioned.
    The release we are working on right now will have some very interesting features for you, but that is a few months out...
    Thanks,
    Jean-Pierre

  • Search Issue with OCR Files and Meta data

    We have set up SharePoint 2013 on a Windows 2012 server, with cumulative updates through August; the version is 15.0.4535.1000.
    I have created two site content types of the Document type. Both content types have 4 site columns in common, and each has two site columns of its own. I configured the Search service application, which is working well and does not report any errors. I have added these two content types to one document library, which has a lot of sub-folders.
    All the files are OCR PDFs with metadata applied.
    Now the issue is that when I search for anything from the Search Center, sometimes I get a result with data from the site columns, and sometimes I get no data, just a link to the file and the file name. The weird thing is that the columns are common, and both full and incremental crawls run properly.
    Below is the screen for my search, where you can see that there is no summary. How can I bring the summary back?
    So what can be the issue here?

    Hi,
    For metadata, which metadata are you not seeing? Are they custom properties within the PDF, and have you checked if you have crawled properties matching these?
    I know there's an issue with last modified on PDFs (http://sharepointfieldnotes.blogspot.no/2013/05/understanding-and-getting-sharepoint.html).
    Thanks,
    Mikael Svenson
    Search Enthusiast - SharePoint MVP/MCT/MCPD - If you find an answer useful, please up-vote it.
    http://techmikael.blogspot.com/
    Author of Working with FAST Search Server 2010 for SharePoint

  • "triggers" and meta-data

    Hello,
    I'd like to have a set of metadata associated with documents, like "last modification time", automatically updated when scripts using XQuery Update execute.
    Am I right that there are triggers that could fire appropriately? Is this planned for future versions of BDB XML?
    Also, metadata is readable in XQuery using the dbxml:metadata() extension, but not writable: is there any reason for this (other than lack of time to implement it ;-) )?
    Thanks,
    Fabrice

    Fabrice Desré wrote:
    Am I right that there are triggers that could fire appropriately? Is this planned for future versions of BDB XML?
    DB XML doesn't support triggers - that's something that the application would have to control.
    Also, metadata is readable in XQuery using the dbxml:metadata() extension, but not writable: is there any reason for this (other than lack of time to implement it ;-) )?
    That's about right :-). It's certainly something we would like to add in the future.
    John
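    To make the point concrete: since DB XML has no triggers, a "last modification time" stamp has to be written by the application around each update. Below is a rough sketch against the 2.x Java API as I recall it; the container name, document name and metadata URI/name are arbitrary examples, so double-check the calls against the javadoc for your BDB XML release.
    import com.sleepycat.dbxml.XmlContainer;
    import com.sleepycat.dbxml.XmlDocument;
    import com.sleepycat.dbxml.XmlManager;
    import com.sleepycat.dbxml.XmlManagerConfig;
    import com.sleepycat.dbxml.XmlValue;

    public class StampLastModified {
        public static void main(String[] args) throws Exception {
            XmlManager mgr = new XmlManager(new XmlManagerConfig());
            XmlContainer cont = mgr.openContainer("docs.dbxml");      // example container
            try {
                XmlDocument doc = cont.getDocument("mydoc.xml");      // example document name
                // Arbitrary metadata URI/name; XQuery can read it back with
                // dbxml:metadata('md:lastModified') once the md prefix is bound to the same URI.
                doc.setMetaData("http://example.com/md", "lastModified",
                        new XmlValue(new java.util.Date().toString()));
                cont.updateDocument(doc, mgr.createUpdateContext());
            } finally {
                cont.delete(); // release the native handles
                mgr.delete();
            }
        }
    }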
