DB Agnostic Dates

I'm trying to create a Universe that can use connections to (just about) any database. Specifically DB2, SQL Server and Oracle for the time being.
Everything works fine with the exception of dates to be used in report prompts. DB2 dates used in prompts fail for lack of the time elements in the calendar widget.
To get them all to a midnight (non-timestamped) date:
A DB2 timestamp field has to be defined as Date([field])
An Oracle datetime needs to be trunc([field])
A SQL Server datetime needs to be DateAdd(dd, DateDiff(dd, 0, [field]), 0)
But, of course, I can't have all 3 of those definitions in a single object, I don't think...
Do I just have to reference the field and format it in the Universe to the longest format? Or do I need to hack some config file? Can I add functions to the .prm files for each database flavor with the same names so that applying a function is applying the logic I need for each?
Searches don't turn up this specific issue.

Jody,
The only thing I can think of is to define a function in each database with the same name, so that the universe object calls that function instead of inline database-specific syntax.
Not sure on DB2, but calling a function in Oracle and SQL Server is the same - you'll have different syntax at the database level but the same function name at the universe layer.
A bit long-winded but database agnostic nonetheless.
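For illustration, a rough sketch of what those same-named functions might look like - untested, the name date_only is arbitrary, and each body just reuses the per-database expression from the question (the DB2 variant in particular is a guess):
[code]
-- Oracle: wrap TRUNC
CREATE OR REPLACE FUNCTION date_only (p_dt IN DATE) RETURN DATE IS
BEGIN
   RETURN TRUNC(p_dt);
END;
/

-- SQL Server: wrap the DateAdd/DateDiff trick (invoked as dbo.date_only)
CREATE FUNCTION dbo.date_only (@dt DATETIME) RETURNS DATETIME AS
BEGIN
   RETURN DATEADD(dd, DATEDIFF(dd, 0, @dt), 0);
END;

-- DB2: a SQL scalar function over the timestamp
CREATE FUNCTION date_only (ts TIMESTAMP) RETURNS DATE
   LANGUAGE SQL
   RETURN DATE(ts);
[/code]
The universe object definition then becomes the same date_only([field]) call against every connection.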
Regards,
Mark

Similar Messages

  • Best way to store Timestamps in files/databases?

    Hi,
    I also need timezone information for the timestamps - I'm sending the info in an XML file between two servers, and finally inserting the information into a PostgreSQL database. What's the best format for this? Maybe use long values?
    Any help much appreciated!

    "So, I can use a timestamp in the xml-files?"
    I don't know what this means. If you mean you take a Timestamp's long millis value and shove it in the XML, without regard to TZ, and then extract it in the same way, yes, you can do that.
    "and also use timestamp in db?"
    Like I said: Create a Timestamp, use PreparedStatement.setTimestamp to push it into a DATETIME column of the DB, and you're golden. You're just marking an instant in time. Like an announcer on global TV is going, "NOW".
    "- and when displaying the timestamps, I can use a SimpleDateFormat to format the timestamp with any TimeZone I wish?"
    Yup.
    "Also (thanx again for your patience!), I have the log-file dates. Here I parse out a String like '2007-12-10 22:10:00 0200' for each file."
    Okay, so you probably want a SimpleDateFormat that knows how to parse the "0200" on the end into appropriate TZ information to give you a TZ-agnostic date.
    That is, that's +2 hrs from GMT, right? So when you parse that, you get a java.sql.Date that represents "that instant." It's 10:10 p.m. there, 5:10 p.m. somewhere else, 3:10 a.m. somewhere else, etc., but that doesn't matter. 10:10 p.m. here is exactly the same as 11:10 p.m. in the next TZ. Once I specify 10:10 p.m. here, I've also implicitly stated 00:10 CDT tomorrow in Chicago, etc.
    Remember, "now" == X:00 PDT == X+2:00 CDT == X+3:00 EDT == X+ 11:30 or something India TZ, etc.

  • How to make BI Info obj Data element inherit source DE documentation?

    Say I am in ECC. I go to SE11, give table name MARA and then double-click on the data element 'MATNR'. Then I click on documentation. I get a popup with short text "Material Number" and definition "Alphanumeric key uniquely identifying the material".
    I am interested in the latter information, the 'Definition' - and in general whatever documentation comes up for a data element in ECC/CRM/SCM/SRM.
    Now I log into SAP BI. I find that under characteristic 0Material, the data element is /BI0/oimaterial. When I double-click this data element and press the documentation button, the system says 'No documentation'.
    My Questions:
    1. Is there a switch we could turn on in source ECC/SRM/CRM/SCM so that the data element in SAP BI inherits the original source field's data element documentation?
    { I am not too convinced by the argument that in BI we have info objects while in ECC we have fields - since I am talking of data-element-level information, I would tend to think of this as an oversight of the designers, or de-prioritization!! }
    2. Could we have an ABAP workaround? That is, in BI we identify the tables that house the mapping between the source and destination data elements and take out this information. Then we extract the data element documentation by function DOCU_GET (from ECC) and use the mapping info above to link the SAP BI data element with the source data element documentation.
    WHY do I want to punish myself as above? My use case is, we take out SAP BI tables, fields, metadata etc. and create a model in a modeling tool and a physical implementation in a 3rd-party DW database as our own canonical, application-agnostic corporate data warehouse. I would want the source data element documentation to flow to this last system as well as the modeling tool.
    Regards
    Sasanka

    "That is, in BI we identify the tables that house the mapping between the source and destination data elements and take out this information. Then we extract the data element documentation by function DOCU_GET (from ECC) and use the mapping info above to link SAP BI data element with source data element documentation."
    1) SAP don't supply this, I would imagine, because R/3 isn't the only possible source of data. I'm currently working on an app that extracts from R/3, an Oracle database and APO. From which of these should I take the documentation for my MATERIAL info object? While being able to transfer the documentation of data elements might be very useful for your app, I can't see that it would generally be of interest to clients - I've certainly never heard of such a requirement. So, in my opinion at least, it isn't a design flaw.
    2) As you've pointed out, you can get the tables that do the mapping, so you know the source data elements, so you can get the documentation. I'm not sure how to store the documentation, but the obvious candidate for a link between infoobject and data element would be the master data of a dedicated infoobject. You could wrap DOCU_GET in an RFC (if it isn't already RFC-enabled), and do a direct call between your 3rd-party app and R/3 to get the documentation. For information about the mapping tables, I'd suggest asking that question specifically in one of the BI forums.
    matt

  • Unicode Migration using National Character Set data types - Best Practice?

    I know that Oracle discourages the use of the national character set and the national character set data types (NCHAR, NVARCHAR), but that is the route my company has decided to take, and I would like to know the best practice regarding this, specifically in relation to stored procedures.
    The database schema is being converted by changing all CHAR, VARCHAR and CLOB data types to NCHAR, NVARCHAR and NCLOB data types respectively and I would appreciate any suggestions regarding the changes that need to be made to stored procedures and if there are any hard and fast rules that need to be followed.
    Specific questions that I have are:
    1. Do CHAR and VARCHAR parameters need to be changed to NCHAR and NVARCHAR types?
    2. Do CHAR and VARCHAR variables need to be changed to NCHAR and NVARCHAR types?
    3. Do string literals need to be prefixed with 'N' in all cases? e.g.
    in variable assignments - v_module_name := N'ABCD'
    in variable comparisons - IF v_sp_access_mode = N'DL'
    in calls to other procedures passing string parameters - proc_xyz(v_module_name, N'String Parameter')
    in database column comparisons - WHERE COLUMN_XYZ = N'ABCD'
    If anybody has been through a similar exercise, please share your experience and point out any additional changes that may be required in other areas.
    Database details are as follows; the application is written in COBOL and is also being changed to be Unicode-compliant:
    Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
    NLS_CHARACTERSET = WE8MSWIN1252
    NLS_NCHAR_CHARACTERSET = AL16UTF16

    ##1. While doing a test conversion I discovered that VARCHAR parameters need to be changed to NVARCHAR2 and not VARCHAR2, same for VARCHAR variables.
    VARCHAR columns/parameters/variables should not be used, as Oracle reserves the right to change their semantics in the future. You should use VARCHAR2/NVARCHAR2.
    ##3. Not sure I understand, are you saying that unicode columns(NVARCHAR2, NCHAR) in the database will only be able to store character strings made up from WE8MSWIN1252 characters ?
    No, I meant literals. You cannot include non-WE8MSWIN1252 characters into a literal. Actually, you can include them under certain conditions but they will be transformed to an escaped form. See also the UNISTR function.
    ## Reason given for going down this route is that our application works with SQL Server and Oracle and this was the best option
    ## to keep the code/schemas consistent between the two databases
    First, you have to keep two sets of scripts anyway because syntax of DDL is different between SQL Server and Oracle. There is therefore little benefit of just keeping the data type names the same while so many things need to be different. If I designed your system, I would use a DB-agnostic object repository and a script generator to produce either SQL Server or Oracle scripts with the appropriate data types or at least I would use some placeholder syntax to replace placeholders with appropriate data types per target system in the application installer.
    ## I don't know if it is possible to create a database in SQL Server with a Unicode characterset/collation like you can in Oracle, that would have been the better option.
    I am not an SQL Server expert but I think VARCHAR data types are restricted to Windows ANSI code pages and those do not include Unicode.
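    To make points 1 and 3 above concrete, a small PL/SQL sketch (the procedure, variable and table names are invented for illustration): parameters and variables move to NVARCHAR2, literals get the N prefix, and characters outside the database character set go through UNISTR.
    [code]
    -- Hypothetical procedure after conversion: NVARCHAR2, not VARCHAR/VARCHAR2
    CREATE OR REPLACE PROCEDURE set_module_access (
       p_access_mode IN NVARCHAR2
    ) IS
       v_module_name NVARCHAR2(30);
    BEGIN
       v_module_name := N'ABCD';           -- N-prefixed literal (question 3)
       IF p_access_mode = N'DL' THEN
          UPDATE module_access
             SET module_name = v_module_name
           WHERE access_mode = p_access_mode;
       END IF;
       -- Characters outside WE8MSWIN1252 must be escaped, e.g. the euro sign:
       -- v_module_name := UNISTR('\20AC');
    END;
    /
    [/code]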
    -- Sergiusz

  • Can Oracle Bulkcopy write data within an external transaction?

    Hi Everyone,
    I am new to ODP.NET and I hope somebody can help me with an issue about bulk copy. What I want to do is have the bulk copy execute within a transaction. The transaction here is external: by the time the bulk copy is executed, the transaction has already begun and has executed some other commands. The way I want to do this is described as follows:
    Step 1: Create a connection (e.g. dbConn)
    Step 2: Use that connection to start a transaction (e.g. dbConn.BeginTransaction)
    Step 3: Use the connection to create and execute some commands (e.g. delete/insert/update some records in a table)
    Step 4: Create an OracleBulkCopy instance with that connection (e.g. Dim copier As DBBulkCopy = New OracleBulkCopy(dbConn, OracleBulkCopyOptions.Default) )
    Step 5: Use the WriteToServer function in the bulkcopy to copy some data across tables (e.g. copier.writetoserver(datatable))
    Step 6: Commit the transaction.
    Now when I run to Step 5, the WriteToServer function, an exception is thrown. The error is "ORA-26085: direct path operation must start its own transaction".
    As far as I understand, it is telling me that the transaction on the connection (dbConn in this example) began before WriteToServer was executed. However, putting the bulk copy process within a transaction is exactly what I want, and it is critical to the task I am trying to finish.
    Can any body let me know how to implement this?
    Thanks very much

    I see the following note in the Oracle Data Provider for .NET Developer's Guide 11g Release 1 (11.1.0.7.20):
    "Note: All bulk copy operations are agnostic of any local or distributed transaction created by the application."
    Hence it seems that bulk copy operations should not be used with user transactions.

  • Recovering data from "crashed" hard drive - white macbook

    I have a 13" white macbook (my daughter's macbook). It looks like the hard drive crashed. She took it to Apple store they told her they can replace the hard drive (still under Apple care), but I am trying to get the data from the hard drive.
    I pulled the hard drive out of the macbook, and put it in a external hard drive enclosure, with USB2 connector. Unfortunately I have to connect the USB to a Windows PC, since that's what I have (she took my PowerBook for school work). The wondows machine does not recognize the hard drive. I tried to connect to a Vista machine and an XP machine same problem.
    Do I have to connect it to a Mac to be able to recognize this hard drive? I thought as long as I put it in a external disk enclosure I will be able to recognize it even with Windows machine.
    I would appreciate any help.
    p.s. I do hear (faint) clicking noise from the hard drive - both when it is in the macbook and when it is in the external enclosure.
    Thanks.

    The enclosure is format-"agnostic". A drive could conceivably be formatted in several different Apple formats; HFS+ (aka Mac OS Extended) is likely what you have. There are also the Windows-based FAT32 and NTFS.
    Windows doesn't properly recognize Mac drive formats unless you add software. There are a couple of commercial tools that might work:
    http://www.macdisk.com/mden.php3
    http://www.paragon-software.com/home/hfs-windows/

  • How can I architect my data layer to yield query result pages to the application as SQL Server prepares them?

    I tried to make the question as explicit as possible.
    Refer to SQL Server Management Studio's Results view. Depending upon the structure of the execution plan, the Results pane may begin displaying results while the query is still executing. Can someone point me in a direction for architecting a data layer (I am tech- and framework-agnostic for this task; any solution will suffice) that will begin receiving pages of the set before SQL Server has completed the entire query?
    The call from the data layer to SQL Server will obviously have to be asynchronous, but is there any additional ceremony that I need to be aware of when issuing OPTION (FAST x) to the query optimizer?
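    For reference, a trivial sketch of the hint in question, applied to the catalog query used later in this thread - FAST 10 asks the optimizer to prefer a plan that returns the first 10 rows quickly, even if the total cost of the query ends up higher:
    [code]
    -- Favor a plan that streams the first 10 rows as soon as possible.
    SELECT *
    FROM information_schema.columns
    OPTION (FAST 10);
    [/code]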

    Thanks for the reply. (I actually meant to put this in the SQL Data Access forum, not the T-SQL forum.)
    "Generally the last step is ORDER BY in a query, so nothing can start before that executes."
    I would imagine you cannot ORDER BY and yield results as they are fetched because of the execution plan that would be generated. For the purposes of this post, please assume that sorting will be done purely client-side.
    "Can you post your query?"
    For purposes of discussion, let's assume that the query is
    select *
    from information_schema.columns
    and also assume that you have "lots" of columns to display.
    This was an exploratory question to see what would be necessary to replicate the behavior of Management Studio's Query Result view in a custom application.
    I would imagine that there's going to be a lot of analysis of the execution plans that get generated in order for the OPTION (FAST x) optimizer hint to do any good, but apart from general tuning concerns that would allow SQL Server to yield a page of data "fast", I was wondering if there was anything else required of the calling client to force it to yield its first page.
    After thinking about this (and phrasing it the way I did in the last sentence), perhaps this is the incorrect forum for this question. I imagine that my concerns are better addressed in forums dedicated to the technology of the calling client (which would be a .NET assembly).
    Be that as it may, if there is any ceremony that SQL Server imposes on clients in order to yield results incrementally, I would expect my question to be in scope for SQL Server discussions (even though I intended this to be in a different SQL Server forum).

  • Throttled data for new heavy users??

    HowardForums had a link to this tidbit from a Verizon site.
    Important Information about Verizon Wireless Data Plans and Features
    As part of our continuing efforts to provide the best experience to our more than 94 million customers, Verizon Wireless is introducing two new network management practices.
    We are implementing optimization and transcoding technologies in our network to transmit data files in a more efficient manner to allow available network capacity to benefit the greatest number of users. These techniques include caching less data, using less capacity, and sizing the video more appropriately for the device. The optimization process is agnostic to the content itself and to the website that provides it. While we invest much effort to avoid changing text, image, and video files in the compression process and while any change to the file is likely to be indiscernible, the optimization process may minimally impact the appearance of the file as displayed on your device. For a further, more detailed explanation of these techniques, please visit www.verizonwireless.com/vzwoptimization
    If you subscribe to a Data Plan or Feature on February 3, 2011 or after, the following applies:
    Verizon Wireless strives to provide customers the best experience when using our network, a shared resource among tens of millions of customers. To help achieve this, if you use an extraordinary amount of data and fall within the top 5% of Verizon Wireless data users we may reduce your data throughput speeds periodically for the remainder of your then current and immediately following billing cycle to ensure high quality network performance for other users at locations and times of peak demand. Our proactive management of the Verizon Wireless network is designed to ensure that the remaining 95% of data customers aren't negatively affected by the inordinate data consumption of just a few users.
    And here is a link to the optimization process

    lewisr13 wrote:
    Comcast does something very similar with their internet. They do it so users don't use the residential service for heavy business usage. Residential service is cheaper than business class.
    They also have a meter so you know where you stand. Also, throttling does not continue into the next month. I just feel VZW should have been more up front about this BEFORE the preordering of the iPhone.
    It will not affect me at all.

  • Mechanism for Info Object Data elements to inherit source documentation?

    Say I am in ECC. I go to SE11, give table name MARA and then double-click on the data element 'MATNR'. Then I click on documentation. I get a popup with short text "Material Number" and definition "Alphanumeric key uniquely identifying the material".
    I am interested in the latter information, the 'Definition' - and in general whatever documentation comes up for a data element in ECC/CRM/SCM/SRM.
    Now I log into SAP BI. I find that under characteristic 0Material, the data element is /BI0/oimaterial. When I double-click this data element and press the documentation button, the system says 'No documentation'.
    My Questions:
    1. Is there a switch we could turn on in source [ECC/SRM/CRM/SCM] so that the data element in SAP BI inherits the original source field's data element documentation?
    { I am not too convinced by the argument that in BI we have info objects while in ECC we have fields - since I am talking of data-element-level information, I would tend to think of this as an oversight of the designers [or de-prioritization]!! }
    2. Could we have an ABAP workaround? That is, in BI we identify the tables that house the mapping between the source and destination data elements and take out this information. Then we extract the data element documentation [by function DOCU_GET] and use the mapping info above to link the SAP BI data element with the source data element documentation.
    WHY do I want to punish myself as above? My use case is, we take out SAP BI tables, fields, metadata etc. and create a model in a modeling tool and a physical implementation in a 3rd-party DW database as our own canonical, application-agnostic corporate data warehouse. I would want the source data element documentation to flow to this last system as well as the modeling tool.
    Regards
    Sasanka

    Go through these links:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/9214b1e5-0601-0010-fdb0-ec32d43b06e0
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/8d7cc990-0201-0010-27a3-d0f74b75a1ee

  • Error serializing non-built-in data type

    hi
    I am getting the following exception while accessing a web service which uses non-built-in data types as parameters to the service method.
    It's called ProcessingOrderTO. The ProcessingOrderTO has three sub-types, one of which is a Collection.
    Has anybody encountered this before?
    I have also attached the generated WSDL.
    I am using WLS 8.1 SP1.
    thanks
    jas
    java.rmi.RemoteException: web service invoke failed: javax.xml.soap.SOAPException:
    failed to serialize class mypackage.orderprocessing.processing.to.ProcessingOrderTO
    weblogic.xml.schema.binding.SerializationException: mapping lookup failure.
    class=interface java.util.Collection
    class context=TypedClassContext{schemaType=['java:language_builtins.util']:Collection};
    nested exception is:
    javax.xml.soap.SOAPException: failed to serialize class mypackage.orderprocessing.processing.to.ProcessingOrderTO
    weblogic.xml.schema.binding.SerializationException: mapping lookup failure.
    class=interface java.util.Collection
    class context=TypedClassContext{schemaType=['java:language_builtins.util']:Collection}
    at weblogic.webservice.core.DefaultPart.invokeSerializer(DefaultPart.java:328)
    at weblogic.webservice.core.DefaultPart.toXML(DefaultPart.java:297)
    at weblogic.webservice.core.DefaultMessage.toXML(DefaultMessage.java:619)
    at weblogic.webservice.core.ClientDispatcher.send(ClientDispatcher.java:206)
    at weblogic.webservice.core.ClientDispatcher.dispatch(ClientDispatcher.java:143)
    at weblogic.webservice.core.DefaultOperation.invoke(DefaultOperation.java:444)
    at weblogic.webservice.core.DefaultOperation.invoke(DefaultOperation.java:430)
    at weblogic.webservice.core.rpc.StubImpl._invoke(StubImpl.java:270)
    at mypackage.orderprocessing.processing.client.OrderProcessingAssemblerXDBeanPort_Stub.processOrder(OrderProcessingAssemblerXDBeanPort_Stub.java:45)
    at mypackage.orderprocessing.processing.client.Client.main(Client.java:69)
    [OrderProcessingAssemblerXDBean.wsdl]

    Hi Mike,
    Thanks for these suggestions. I have now completely moved to specific types instead of the Collections and ArrayList.
    Things seem to be working fine ... thanks for all your help.
    thanks
    jas
    "Michael Wooten" <[email protected]> wrote:
    Hi Jas,
    My mistake. I copied the code from a DII client and you are using a Stub one :-)
    The correct code is:
    TypeMappingRegistry registry = service.getTypeMappingRegistry();
    where the service variable is what was returned from your XXX_Impl() call. You don't have to cast anything, because the object returned from that method already extends javax.xml.rpc.Service.
    w.r.t. your "moreover" text:
    As long as the definition of the non-built-in types is the same as the WSDL you sent earlier, you shouldn't have a problem. The question I have is why are you using a java.util.Collection object in a web service operation's method signature in the first place? When you do this, you do two things:
    1. Produce WSDL that has xsd:anyType for element types. xsd:anyTypes are fine for defining XML grammars (i.e. WSDL, SOAP, etc.), but they add too much mystery to a business service. They basically tell the WSDL consumer, "the content of this element can be anything, so you'll have to figure out what it is when you get it. Hee, hee, hee!"
    2. Introduce Java-specific classes into your web service. If the extrasList element always contains the same type of elements, the type for it should be an array of whatever that object is. If you used an ArrayList because you don't know how many elements there will be, you can still use it in the implementation and call the toArray() method to return a typed array.
    To me, it just seems wrong to use Java container objects (i.e. ArrayList, Collection, HashMap, Set, etc.) in a web service operation's method signature, because it goes against the "programming language agnostic" idea of web services :-) Just my two cents.
    Regards,
    Mike Wooten
    "jas" <[email protected]> wrote:
    hi Michael
    thanks for the reply ..
    I tried using the snippet below .. but instead got a ClassCastException at
    TypeMappingRegistry registry = ((Service) port).getTypeMappingRegistry();
    Moreover:
    1. When I generate client jars using the clientgen task, the definition of my non-built-in types is different ...
    2.
    thanks
    jas
    "Michael Wooten" <[email protected]> wrote:
    Hi Jas,
    From the stack trace, it looks like you are using a JAX-RPC stub client. Our java.util.Collection serializer/deserializer is assigned to the "java:language_builtins.util" namespace, so you'll need to register the ArrayList and Collection in the TypeMappingRegistry.
    Here's a code snippet:
    import java.util.ArrayList;
    import java.util.Collection;
    import javax.xml.soap.SOAPConstants;
    import javax.xml.namespace.QName;
    import javax.xml.rpc.Service;
    import javax.xml.rpc.encoding.TypeMapping;
    import javax.xml.rpc.encoding.TypeMappingRegistry;
    import weblogic.xml.schema.binding.internal.builtin.JavaUtilArrayListCodec;
    import weblogic.xml.schema.binding.internal.builtin.JavaUtilCollectionCodec;
    TypeMappingRegistry registry = ((Service) port).getTypeMappingRegistry();
    TypeMapping mapping = registry.getTypeMapping(SOAPConstants.URI_NS_SOAP_ENCODING);
    mapping.register(
         ArrayList.class,
         new QName("java:language_builtins.util", "ArrayList"),
         new JavaUtilArrayListCodec(),
         new JavaUtilArrayListCodec());
    mapping.register(
         Collection.class,
         new QName("java:language_builtins.util", "Collection"),
         new JavaUtilCollectionCodec(),
         new JavaUtilCollectionCodec());
    // invoke the web service operation that uses the ArrayList and Collection
    There used to be a problem where the type mapping information in the MyServiceClient.jar (the one that the clientgen Ant task produces) was incorrect, but I think this was fixed well before WLS 8.1 :-)
    I haven't tested the above with WLS 8.1 SP1, but I think it may fix the problem you are having.
    BTW: JavaUtilArrayListCodec and JavaUtilCollectionCodec are in ${WL_HOME}/server/lib/webserviceclient.jar.
    Regards,
    Mike Wooten
    "jas" <[email protected]> wrote:
    hi
    I am getting the following exception while accessing a webservice
    which
    uses non-built-in
    data types as parameter to the service method
    Its called ProcessingOrderTO.
    The ProcessingOrderTO has three sub-types.. one of which is a Collection
    Does anybody encountred this before
    I have also attch the WSDL generated .
    I am using wls81 SP1
    thanks
    jas
    java.rmi.RemoteException: web service invoke failed: javax.xml.soap.SOAPException:
    failed to serialize class mypackage.orderprocessing.processing.to.ProcessingOrderTOweblogic.xml.schema.binding.SerializationException:
    mapping lookup failure. class=interface java.util.Collection classcontext=TypedClassContext{schemaType=['java:language_builtins.util']:Collection};
    nested exception is:
    javax.xml.soap.SOAPException: failed to serialize class mypackage.orderprocessing.processing.to.ProcessingOrderTOweblogic.xml.schema.binding.SerializationException:
    mapping lookup failure. class=interface java.util.Collection classcontext=TypedClassContext{schemaType=['java:language_builtins.util']:Collection}
    javax.xml.soap.SOAPException: failed to serialize class mypackage.orderprocessing.processing.to.ProcessingOrderTOweblogic.xml.schema.binding.SerializationException:
    mapping lookup failure. class=interface java.util.Collection classcontext=TypedClassContext{schemaType=['java:language_builtins.util']:Collection}
    at weblogic.webservice.core.DefaultPart.invokeSerializer(DefaultPart.java:328)
    at weblogic.webservice.core.DefaultPart.toXML(DefaultPart.java:297)
    at weblogic.webservice.core.DefaultMessage.toXML(DefaultMessage.java:619)
    at weblogic.webservice.core.ClientDispatcher.send(ClientDispatcher.java:206)
    at weblogic.webservice.core.ClientDispatcher.dispatch(ClientDispatcher.java:143)
    at weblogic.webservice.core.DefaultOperation.invoke(DefaultOperation.java:444)
    at weblogic.webservice.core.DefaultOperation.invoke(DefaultOperation.java:430)
    at weblogic.webservice.core.rpc.StubImpl._invoke(StubImpl.java:270)
    at mypackage.orderprocessing.processing.client.OrderProcessingAssemblerXDBeanPort_Stub.processOrder(OrderProcessingAssemblerXDBeanPort_Stub.java:45)
    at mypackage.orderprocessing.processing.client.Client.main(Client.java:69)

  • LoadBatchActions XML Date Parsing

    We are using LoadBatchActions to populate a matrix with data in a single call by passing an XML document, but we have noticed that in B1 2005, date columns in the matrix seem to parse the date wrong.  When you have the date format set to DD/Month/YYYY then it expects a month name in the XML date value instead of the neutral YYYYMMDD format.  I thought XML was supposed to be a language-neutral format, and that data passed to B1 was supposed to be in a language-neutral format.  Is there an error here?

    That's the problem.  Local settings should not affect data format at this level.  I can work around it by changing the string according to the local settings, but data formats are supposed to be locale-agnostic.
    The following code demonstrates the problem.  This code should work no matter what the local settings are, but it fails when using DD/Month/YYYY format:
    [code]
    Sub CreateDateMatrix()
       Dim frm As SAPbouiCOM.Form
       Dim cp As SAPbouiCOM.FormCreationParams
       Set cp = sboApp.CreateObject(cot_FormCreationParams)
       cp.FormType = "FSE_TestDate"
       cp.BorderStyle = fbs_Sizable
       Set frm = sboApp.Forms.AddEx(cp)
       frm.Visible = True
       Dim mtxItem As SAPbouiCOM.Item
       Set mtxItem = frm.Items.Add("Matrix", it_MATRIX)
       mtxItem.Left = 10
       mtxItem.Top = 10
       mtxItem.Width = frm.ClientWidth - 20
       mtxItem.Height = frm.ClientHeight - 20
       Dim ds As SAPbouiCOM.UserDataSource
       Set ds = frm.DataSources.UserDataSources.Add("DateDS", dt_DATE)
       Dim mtx As SAPbouiCOM.Matrix
       Set mtx = mtxItem.Specific
       mtx.Columns.Add("RowMark", it_EDIT).Editable = False
       With mtx.Columns.Add("Date", it_EDIT)
          .Editable = False
          .DataBind.SetBound True, , "DateDS"
          .Width = 100
          .TitleObject.Caption = "Date"
       End With
       Dim populate As String
       populate = "<Application><forms><action type=""update"">" & _
          "<form uid=""" & frm.UniqueID & """>"
       populate = populate & "<items><action type=""update"">" & _
          "<Item uid=""Matrix"">"
       populate = populate & "<Specific><rows><action type=""add"">"
       Const rowCount As Integer = 5
       Dim i As Integer
       For i = 1 To rowCount
          populate = populate & "<row index=""" & CStr(i) & """ />"
       Next
       populate = populate & "</action><action type=""update"">"
       Dim dateString As String
       For i = 1 To rowCount
          dateString = Format$(Now + i, "yyyyMMdd")
          populate = populate & "<row index=""" & CStr(i) & """>" & _
             "<column uid=""Date"" string=""" & dateString & """ /></row>"
       Next
       populate = populate & "</action></rows></Specific></Item></action>" & _
          "</items></form></action></forms></Application>"
       sboApp.LoadBatchActions populate
       MsgBox sboApp.GetLastBatchResults
    End Sub
    [/code]

  • Power Pivot: Can we include a week-level data filter in the timeline?

    Can we include week-level data in the timeline? The Timeline object takes the date and always shows the same 4 levels (Year, Quarter, Month, Date). What if I want to filter on a week level?
    thanks

    If a week selection is an immediate requirement for you, you could introduce a week name column in your calendar table (YYYY WK ww) and create a slicer based on that.
    All of the built-in time intelligence in DAX and Tabular is week-agnostic, in part due to the myriad rules for handling partial weeks at year-end and year-beginning, and in part due to the different week-ending days in use. If this reasoning is followed, it may be unlikely that we will see a week selection in a timeline.
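    As a sketch of that week-name-column idea in T-SQL (assuming a calendar table named DimDate with a Date column - both names are made up):
    [code]
    -- Add a 'YYYY WK ww' label and build the slicer on it. ISO_WEEK follows
    -- the ISO partial-week rules mentioned above; note that near year
    -- boundaries the ISO week number can belong to the adjacent year.
    ALTER TABLE DimDate ADD WeekName AS
        CAST(YEAR([Date]) AS varchar(4)) + ' WK ' +
        RIGHT('0' + CAST(DATEPART(ISO_WEEK, [Date]) AS varchar(2)), 2);
    [/code]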

  • P2 ingested footage with wrong date created metadata

    Hello, we're trying to standardize all of the media for our systems. Part of this includes ingesting P2 footage using Prelude and encoding it as DVCProHD QuickTime files. I know that the recommended method Adobe gives for ingesting P2 footage is to keep the file structure intact and use the media browser function of Premiere; this, however, doesn't suit our company's needs of maintaining a relatively NLE-agnostic media library and having QuickTime-compatible discrete video clips of all our media. We're using Mac OS X operating systems, by the way.
    The problem is that, when using Prelude to ingest, it seems that Prelude obliterates the "date created" metadata and file data (e.g. the file data you see when selecting "get info" on the clip itself). After ingesting a clip, if I open it in Premiere, there is no date-based metadata that matches the original record time on the .mxf file itself. Nor, when I view the clip in OS X's "get info", does the "date created" panel reflect the date that the original file was created. It seems that it should be possible to write the original "date created" data to the ingested file (in the ingest panel in Prelude I can see the correct "date created" time); FCPX is able to do this when ingesting P2 media. Is there an option that I'm missing, or is this simply not possible with this version of Prelude?
    To illustrate, here's a screencap of the ingest panel in Prelude; notice the "date created" column reads 2013-06-28T11:43:05, which is the correct record date of that clip.
    However, after I ingest the clip, both within Premiere and in the "get info" panel in OS X, the "date created" now refers only to my import time.

    Thanks both for your responses. I checked the .mxf file's "date created" and it is indeed empty (see picture).
    The "date modified" time, however, is correct. I do understand that when dragging media directly from the media browser in Premiere, the metadata displays correctly; what I think should also happen is that footage ingested and transcoded via Prelude displays correct metadata too. As shown in my pictures above, Prelude does indeed see the correct "date created" (wherever it gets that information); it just obliterates it after it transcodes the clip. It's not available in the "get info" data (which, somehow, FCPX preserves when importing P2 media through their system) nor (most strangely) in Premiere. Any settings or changes that could be made to address this would be appreciated.

  • How to Use SOAPArray to Exchange Data with a Web Service

    The method of a prototype Web service I created is defined to take many parameters and return an object of a user-defined class. Furthermore, the user-defined class includes data elements of another user-defined class and the Java ArrayList class.
    This works with a Java client referencing the WebLogic-created client.jar file, but I don't know how well it will work with a non-Java client - in particular, with Perl, which is the language that will be used by the developer who first will test with the prototype.
    In posts to this newsgroup the use of "language-specific, generic containers" has been discouraged and the "language-agnostic" SOAPArray recommended. I have searched this newsgroup and the Web for examples of how to use a SOAPArray in a Web service EJB to receive parameters and return results, but found none.
    Will someone refer me to an example or give an overview of how a Java Web service EJB running in WebLogic 6.1 would use SOAPArray to get parameter values and return results?
    Also, I would like confirmation that it is best to use SOAPArray to exchange data with a Web service to achieve the goal of a service accessible by any language.
    Thank you.

    Replies in-line:
    "How are the structures, e.g. gltrans-workType, defined in the Web service?"
    The structure is made up of nested Java Beans, but this does not mean that the client for your web service has to be written in Java. The WSDL that I sent contains everything that a .NET-based (or Perl-based, or Python-based, or VB-based, or C++-based) Web Service stack needs to correctly create all the data types in the web service's signature! That's the beauty of XML Schema! It's programming-language independent :-)
    "In other words, what definition in Java resulted in the WSDL statements?"
    The WSDL wasn't produced by WLS 6.1, but it (WLS 6.1) can consume it.
    "What is the signature of method submitGLTransWorkAsJavaBean() in the Web service?"
    public void submitGLTransWorkAsJavaBean(GlTransactionsCpyType glTransactionsCpyType)
    GlTransactionsCpyType is the outer-most Java Bean. WLS 6.1 does not generate Java Beans for you, but it will use ones that you defined. See the Java Bean tutorial on the Javasoft site for details on how to create a Java Bean.
    "Was the WSDL generated using the WL tools for creating a Web service?"
    No.
    Conclusion:
    You asked for someone to provide you with an example of how to use a SOAP array in a WSDL, which is what the attached file contained :-) What you want to do now is find a tool that can generate Java Bean code from this WSDL (Apache Axis has a wsdl2java tool that should work), or create the Java Beans yourself. Afterwards, create a WLS 6.1 Web Service and expose it for a Perl or .NET client.
    Regards,
    Mike Wooten
    "Jeff Carey" <[email protected]> wrote:
    Please elaborate.
    How are the structures, e.g. gltrans-workType, defined in the Web service? In other words, what definition in Java resulted in the WSDL statements?
    What is the signature of method submitGLTransWorkAsJavaBean() in the Web service?
    Was the WSDL generated using the WL tools for creating a Web service?
    Thank you.
    "Michael Wooten" <[email protected]> wrote:
    Hi Jeff,
    Sounds like a pretty cool prototype :-)
    I have attached a WSDL (at the bottom of this post) that contains a <schema> that uses a SOAPArray to create an array of a <complexType>.
    HTH,
    Mike Wooten
    "Jeff Carey" <[email protected]> wrote:
    The method of a prototype Web service I created is defined to take
    many
    parameters
    and return an object of a user defined class. Furthermore, the user
    defined class
    includes data elements of another user defined class and the Java ArrayList
    class.
    This works with a Java client referencing the WebLogic created client.jar
    file
    but I don't know how well it will work with a non-Java client. Inparticular,
    with Perl which is the language that will be used by the developerwho
    first will
    test with the prototype.
    In posts to this newsgroup use of "language-specific, generic containers"
    has
    been discouraged and the "language-agnostic" SOAPArray recommended.
    I have searched
    this newgsroup and the Web for examples of how to use a SOAPArray in
    a Web service
    EJB to receive parameters and return results but found none.
    Will someone refer me to an example or give an overview of how a Java
    Web service
    EJB running in WebLogic 6.1 would use SOAPArray to get parameter values
    and return
    results?
    Also, I would like confirmation that it is best to use SOAPArray toexchange
    data
    with a Web service to achieve the goal of a service accessible by any
    language.
    Thank you.

  • Standard application configuration and data paths on Linux

    Hi,
    I have a problem choosing the proper place for an application's global configuration and data paths on Linux.
    I saw following paths for application configuration:
    /etc/app_name
    /etc/xdg/app_name
    /usr/share/app_name
    /usr/local/share/app_name
    /opt/app_name
    and following for application data:
    /usr/share/app_name
    /usr/local/share/app_name
    /opt/app_name
    Which directories are standard and distribution independent?
    best regards,
    Lukasz
    Last edited by lgro (2012-02-16 20:46:23)

    Wouldn't environment variables like XDG_DATA_HOME, XDG_CONFIG_DIRS, XDG_CONFIG_HOME, etc be best?
    Many languages' standard libraries have functions for accessing these efficiently in a distro-agnostic manner.

Maybe you are looking for

  • Database - Show SQL Query question..
  • TFS2013 Custom Alert Email Issue
  • AirPlay without network
  • SATA Locks on 3 & 4 ??!!
  • Immerse Style Issue