Issue with using a formatted date column in the ORDER BY clause...

Hi,
1) Through a function I am listing all the months in a year,
like JAN-2007, FEB-2007, MAR-2007, ...
2) I am comparing these values against a date column in a table,
and if there are no values in a particular period it returns null (simply put, I am using an outer join).
3) The issue:
when I join the two queries, the order of the dates is not maintained.
My requirement is that the output should be
Jan-2007 = 3
Feb-2007 = 5
Mar-2007 = null
etc.
But I end up with the months in alphabetical order,
like
Apr-2007 = 5
Aug-2007 = 10
etc.
Can anybody let me know how I can get the result ordered by date rather than by the character value?
When I use the date column in the ORDER BY, I run into an error:
ORA-01858: a non-numeric character was found where a numeric was expected
It's kind of urgent...
Any help is appreciated.
KK

When I use an outer join as follows:
"and upper(to_char(a.task_planned_start_date,'mon-rrrr')) = d.period_name(+)"
all the null periods end up at the bottom. That's standard behaviour.
It seems you need to sort by a.task_planned_start_date.
Look below:
SQL> select e.ename, ec.ename from emp e, emp_copy ec
  2  where e.empno = ec.empno(+);
ENAME      ENAME
SMITH      SMITH
ALLEN
WARD       WARD
JONES
MARTIN     MARTIN
BLAKE
CLARK      CLARK
SCOTT
KING       KING
TURNER
ADAMS      ADAMS
JAMES
FORD       FORD
MILLER
14 rows selected.
SQL> select e.ename, ec.ename from emp e, emp_copy ec
  2  where e.empno = ec.empno(+)
  3  order by ec.ename
  4  /
ENAME      ENAME
ADAMS      ADAMS
CLARK      CLARK
FORD       FORD
KING       KING
MARTIN     MARTIN
SMITH      SMITH
WARD       WARD
JAMES
TURNER
ALLEN
MILLER
BLAKE
JONES
SCOTT
14 rows selected.
SQL> select e.ename, ec.ename from emp e, emp_copy ec
  2  where e.empno = ec.empno(+)
  3  order by e.ename
  4  /
ENAME      ENAME
ADAMS      ADAMS
ALLEN
BLAKE
CLARK      CLARK
FORD       FORD
JAMES
JONES
KING       KING
MARTIN     MARTIN
MILLER
SCOTT
SMITH      SMITH
TURNER
WARD       WARD
14 rows selected.
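Applied to your query: carry the underlying DATE value alongside the formatted period name and sort by that. Below is a minimal sketch; the inline month generator and the table/column names (tasks, task_planned_start_date) are assumptions, so substitute your own period function and tables:
select m.period_name,
       count(a.task_planned_start_date) as task_count
from  (select add_months(date '2007-01-01', level - 1) as period_start,
              upper(to_char(add_months(date '2007-01-01', level - 1), 'mon-rrrr')) as period_name
       from dual
       connect by level <= 12) m,
      tasks a
where upper(to_char(a.task_planned_start_date(+), 'mon-rrrr')) = m.period_name
group by m.period_name, m.period_start
order by m.period_start;
Note that COUNT returns 0 rather than null for empty months; wrap it in NULLIF(..., 0) if you really need null. ORA-01858 usually means a string in an unexpected format was fed to a date conversion (for example, TO_DATE applied to the already-formatted period name), so ordering by a real DATE value avoids the error entirely.
Regards.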

Similar Messages

  • Issue with the Posting Date of the Purchase Order.

    Hi All,
    There are fields in BW like SSL1: Time OK, SSL2: Qty OK, SSL3: Time & Qty OK, SSL4: Days Late (routines are written to calculate them). These fields indicate whether the delivery against a GR is OK or not with respect to time, quantity, and the number of days late.
    But the issue I am facing is this:
    if there is only 1 delivery/GR against a single item, the calculations in BW are correct - i.e. for a particular PO with only one delivery, the above fields like SSL1: Time OK and SSL2: Qty OK show that the delivery was done within the specified time and everything is OK (in case it was delivered within the allotted time).
    But if there are multiple deliveries or multiple GRs posted for one PO item, the calculations go wrong - i.e. even if the delivery is done well within the specified time, it is reported as delivered too late, because the earlier dates are overwritten.
    Can anyone throw some light on how I can go about solving this issue?
    I am thinking of declaring the Posting Date as a key field of the DSO (as of now it is a data field). I also want to know the impact of making it a key field.
    Thanks in advance,
    Prasapbi

    Hi,
    As I understand it, you have a DSO based on Purchase Order, and your key fields are PO and its line item. The problem, as you stated, will always be there if multiple deliveries/GRs are created for a single line item, because the system overwrites entries with the same key.
    The problem with adding Posting Date as a key field is that your key then becomes PO - PO line item - date. When the PO is created, the posting date will be blank (correct me here if I am wrong), so you will get two entries for the same PO/line item combination: one without a date and one with a date, which again would be incorrect. If my assumption about the posting date is wrong, your data may still not be correct, because you may then have many entries with the same posting date, which again would overwrite each other.
    If there were any direct link between a PO line item and the number of deliveries that will be created for it, you could bring that field into the DSO as a key field. But I don't think there is any such field.
    Looking at your report requirement, I would suggest that you build a DSO based on goods receipts and then calculate these key figures by comparing the GR posting date with the PO line item date.
    Alternatively, you can change the way your DataSource works (if it is a generic one based on a function module). Since your main requirement is to check whether the GR posting date has met your SLA or not, you should fetch all the details only when a GR is created and make your key PO - PO line item - GR.
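    For the date comparison itself, a minimal sketch of such routine logic (all field names below are assumptions, not your actual DataSource fields):
    " Hypothetical GR-based logic: compare the GR posting date with the
    " PO item's planned delivery date to derive "Time OK" and "Days Late".
    IF ls_source-gr_posting_date > ls_source-po_delivery_date.
      ls_result-days_late = ls_source-gr_posting_date - ls_source-po_delivery_date. " SSL4
      ls_result-time_ok   = ' '.                                                    " SSL1
    ELSE.
      ls_result-days_late = 0.
      ls_result-time_ok   = 'X'.
    ENDIF.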

  • Issue with using a multi-valued attribute in a calculation

    Hi,
    I have the following two attributes involved in a calculation:
    QTY (mdex:long-set)     VOLUME (mdex:long)
    1^2^3^4^7                2
    3                        4
    I want to calculate the total volume as (MAX value of QTY in the set) * VOLUME.
    It seems there is no built-in function to calculate the MAX value of a set. So, for now, I am trying to calculate it using ARB with the following query:
    DEFINE "myVW" AS SELECT
    ARB(CASE WHEN IS_EMPTY("QTY") THEN { 0 } ELSE "QTY" END) AS "Quantity",  /* QTY can be NULL */
    ARB("TOTAL_PART_VOLUME") AS "UnitVolume",  /* ARB to prevent "Source attribute must be aggregated" error */
    "UnitVolume" * "Quantity" AS "TotalVolume"
    FROM BASE_VIEW
    GROUP BY "SOME_OTHER_COLUMN"
    I am getting the following error: Cannot multiply mdex:long and mdex:long-set. I am not sure why "Quantity" is still a multi-valued field.
    Is the above calculation possible at all? Or would a better approach be to calculate TotalVolume using a transformer in the integrator during data load?
    Thanks in advance. I am using Endeca version 7.6.1.
    daich

    Hi Daich,
    Doing an ARB (or nearly any set-based operation) on a set will still return you a set. In your case, it's a set of one (which might be the least useful thing there is).
    The only way you can really go from sets to members of sets (which are non-set types) is to do a GROUP BY MEMBERS(QTY). You could then do that, plus some EQL gymnastics, to get back to taking the max of the grouped attribute, as sketched below.
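    Roughly along these lines - a sketch only (the intermediate statement name "PerMember" is made up, and I haven't validated this against 7.6.1):
    DEFINE "PerMember" AS SELECT
      ARB("TOTAL_PART_VOLUME") AS "UnitVolume"
    FROM "BASE_VIEW"
    GROUP BY "SOME_OTHER_COLUMN", MEMBERS("QTY") AS "QtyMember";
    DEFINE "myVW" AS SELECT
      MAX("QtyMember") AS "Quantity",
      ARB("UnitVolume") * MAX("QtyMember") AS "TotalVolume"
    FROM "PerMember"
    GROUP BY "SOME_OTHER_COLUMN"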
    Depending on your data volumes, this EQL could perform sub-optimally. You may want to consider adjusting your data model to account for this by having single-assign and multi-assign versions of the same attribute. It's a little tough to say whether that's an option for your use case without more info, though.
    Regards,
    Patrick Rafferty
    Branchbird.

  • XML Parse issues when using Network Data Model LOD with Springframework 3

    Hello,
    I am having issues with using NDM in conjunction with Spring 3. The problem is that there is a dependency on the ConfigManager class, in that it has to use Oracle's XML parser from xmlparserv2.jar, and this parser seems to have a history of problems with parsing Spring schemas.
    My setup is as follows:
    Spring version: 3.0.1
    Oracle: 11gR2 and the corresponding spatial libraries
    Note that when using the Xerces parser, there is no issue here. It occurs only while using Oracle's specific parser, which appears to be hard-coded into the ConfigManager. Spring fortunately offers a workaround, where I can force it to use a specific parser when loading the Spring configuration, as follows:
    -Djavax.xml.parsers.DocumentBuilderFactory=com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderFactoryImpl
    But this is an extra deployment task we'd rather not have. Note that this issue has been brought up before in relation to OC4J. See the following link:
    How to change the defaut xmlparser on OC4J Standalone 10.1.3.4 for Spring 3
    My question is: is there any other way to configure LOD so that it won't have the dependency on the Oracle parser?
    Also, FYI, here is the exception that is occurring, as well as the header of my Spring file.
    org.springframework.beans.factory.xml.XmlBeanDefinitionStoreException:
    Line 11 in XML document from URL [file:/C:/projects/lrs_network_domain/service/target/classes/META-INF/spring.xml] is invalid;
    nested exception is oracle.xml.parser.schema.XSDException: Duplicated definition for: 'identifiedType'
         at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.doLoadBeanDefinitions(XmlBeanDefinitionReader.java:396)
         [snip]
         ... 31 more
    Caused by: oracle.xml.parser.schema.XSDException: Duplicated definition for: 'identifiedType'
         at oracle.xml.parser.v2.XMLError.flushErrorHandler(XMLError.java:425)
         at oracle.xml.parser.v2.XMLError.flushErrors1(XMLError.java:287)
         at oracle.xml.parser.v2.NonValidatingParser.parseDocument(NonValidatingParser.java:331)
         at oracle.xml.parser.v2.XMLParser.parse(XMLParser.java:222)
         at oracle.xml.jaxp.JXDocumentBuilder.parse(JXDocumentBuilder.java:155)
         at org.springframework.beans.factory.xml.DefaultDocumentLoader.loadDocument(DefaultDocumentLoader.java:75)
         at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.doLoadBeanDefinitions(XmlBeanDefinitionReader.java:388)
    Here is the header of my Spring configuration file:
    <?xml version="1.0" encoding="UTF-8"?>
    <beans xmlns="http://www.springframework.org/schema/beans"
           xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
           xmlns:aop="http://www.springframework.org/schema/aop"
           xmlns:tx="http://www.springframework.org/schema/tx"
           xmlns:context="http://www.springframework.org/schema/context"
           xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
           http://www.springframework.org/schema/aop http://www.springframework.org/schema/aop/spring-aop.xsd
           http://www.springframework.org/schema/tx http://www.springframework.org/schema/tx/spring-tx.xsd
           http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context.xsd">
    Thanks, Tom

    I ran into this exact issue while trying to get Hibernate and Spring working with an Oracle XMLType column, and found a better solution than using JVM arguments as you mentioned.
    Why is it happening?
    The xmlparserv2.jar uses the JAR Services API (Service Provider Mechanism) to change the default javax.xml classes used for the SAXParserFactory, DocumentBuilderFactory and TransformerFactory.
    How did it happen?
    The javax.xml.parsers.FactoryFinder looks for custom implementations by checking, in this order: JVM system properties, %JAVA_HOME%/lib/jaxp.properties, then config files under META-INF/services on the classpath, before falling back to the default implementations included with the JDK (com.sun.org.*).
    Inside xmlparserv2.jar exists a META-INF/services directory, which the javax.xml.parsers.FactoryFinder class picks up and uses:
    META-INF/services/javax.xml.parsers.DocumentBuilderFactory (which defines oracle.xml.jaxp.JXDocumentBuilderFactory as the default)
    META-INF/services/javax.xml.parsers.SAXParserFactory (which defines oracle.xml.jaxp.JXSAXParserFactory as the default)
    META-INF/services/javax.xml.transform.TransformerFactory (which defines oracle.xml.jaxp.JXSAXTransformerFactory as the default)
    Solution?
    Switch all 3 back, otherwise you'll see weird errors. The javax.xml.parsers.* settings fix the visible errors, while javax.xml.transform.* fixes more subtle XML parsing issues (in my case, with Apache Commons Configuration reading/writing).
    QUICK SOLUTION to solve the application server startup errors:
    JVM Arguments (not great)
    To override the changes made by xmlparserv2.jar, add the following JVM properties to your application server startup arguments. The javax.xml.parsers.FactoryFinder logic will check system properties first.
    -Djavax.xml.parsers.SAXParserFactory=com.sun.org.apache.xerces.internal.jaxp.SAXParserFactoryImpl
    -Djavax.xml.parsers.DocumentBuilderFactory=com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderFactoryImpl
    -Djavax.xml.transform.TransformerFactory=com.sun.org.apache.xalan.internal.xsltc.trax.TransformerFactoryImpl
    However, if you run test cases using @RunWith(SpringJUnit4ClassRunner.class) or similar, you will still experience the error.
    BETTER SOLUTION to the application server startup errors AND test case errors:
    Option 1: Use JVM arguments for the app server and a @BeforeClass method in your test cases:
    @BeforeClass
    public static void forceDefaultXmlFactories() {
        // Restore the JDK default factories before the Spring context loads
        System.setProperty("javax.xml.parsers.DocumentBuilderFactory", "com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderFactoryImpl");
        System.setProperty("javax.xml.parsers.SAXParserFactory", "com.sun.org.apache.xerces.internal.jaxp.SAXParserFactoryImpl");
        System.setProperty("javax.xml.transform.TransformerFactory", "com.sun.org.apache.xalan.internal.xsltc.trax.TransformerFactoryImpl");
    }
    If you have a lot of test cases, this becomes painful.
    Option 2: Create your own Service Provider definition files in the compile/runtime classpath for your project, which will override those included in xmlparserv2.jar.
    In a maven spring project, override the xmlparserv2.jar settings by creating the following files in the %PROJECT_HOME%/src/main/resources directory:
    %PROJECT_HOME%/src/main/resources/META-INF/services/javax.xml.parsers.DocumentBuilderFactory (which defines com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderFactoryImpl as the default)
    %PROJECT_HOME%/src/main/resources/META-INF/services/javax.xml.parsers.SAXParserFactory (which defines com.sun.org.apache.xerces.internal.jaxp.SAXParserFactoryImpl as the default)
    %PROJECT_HOME%/src/main/resources/META-INF/services/javax.xml.transform.TransformerFactory (which defines com.sun.org.apache.xalan.internal.xsltc.trax.TransformerFactoryImpl as the default)
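    Each of those three files is plain text containing just the single fully-qualified class name of the implementation to use; for example, the javax.xml.parsers.DocumentBuilderFactory file consists of the one line:
    com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderFactoryImpl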
    These files are picked up both by the application server (no JVM arguments required) and by test runs, which solves the unit test issues without requiring any code changes.
    This is a snippet of my longer solution for how to get Hibernate and Spring to work with an Oracle XMLType column, found on Stack Overflow.

  • CSV to Excel 2013 - Issue with number format

    Hi,
    I use Excel 2013 to manipulate CSV files, and I have run into an issue with number-formatted cells. In my CSV file there is one column holding latitude values in a format like 52.05456464. When I open the CSV file in Excel 2013, all values in that column get a "." thousands separator, so the number is displayed as 52.054.454.464. I tried to change the cell format: the category was Number when opened, and when I changed it to General I lost all the "." characters and got 52054454464.
    I also tried to open the same file on another machine with Excel 2010; there the file opens correctly without issues, and the cell format category is General by default.
    I hope that there is some kind of resolution for this.

    Hi
    According to your description, we may follow these steps:
    highlight column B > Home tab > Number section > select Number and choose "More Number Formats" > make sure the negative number format is chosen correctly. If you are using a Custom format, we will have to change it.
    Hope it helps
    Best regards

  • Issue with printing of Currency Value.

    Hi,
    I have an issue with printing a currency value.
    (Please let me know if a thread on this already exists - I searched before posting but did not find one.)
    The problem is as follows:
    I have a field s_value TYPE MARM-SALK3 (which is of currency type).
    The value of the field is actually 1664.1450, currency USD.
    But when I print this variable in an ALV report, it prints as 166414.50, even though I used the same data type (SALK3).
    Please help me fix this.
    Regards,
    Venkat

    Hi,
    If you use ALV to display a currency value, you must make sure the currency amount is linked to its currency unit. You can achieve this via the ALV field catalog:
    " Link the amount column to the column holding its currency key, so that
    " ALV shifts the decimal places according to the currency (e.g. USD).
    wa_fieldcat-fieldname  = 'SALK3'.   " amount field
    wa_fieldcat-tabname    = 'ITAB'.
    wa_fieldcat-cfieldname = 'WAERS'.   " currency key field in the same table
    wa_fieldcat-ctabname   = 'ITAB'.
    APPEND wa_fieldcat TO gt_fieldcat.
    Please check,
    Regards,

  • Issue with Ord Start Date

    Hi,
    I am having an issue with the Ordinary Depreciation Start Date.
    I create asset XYZ and then post the acquisition for the new asset; for example, using FB01 I enter today's date and post.
    Now when I go to the Dep. Area tab, for depreciation key M200 the Ord. Dep. Start Date is 15 Dec 2010.
    For this depreciation key, in the period control method (transaction AFAMP) the value entered for Acquisition is 6 (that is, at the start of the year), and the period control (transaction OAVS) is 4 (that is, first-year convention at half-year start date).
    Our client is using a fiscal year of 1 June 2010 to 31 May 2011.
    So when I post the acquisition for the asset, as per the period control the Ord. Dep. Start Date should be 1 Dec 2010, but here it is taking 15 Dec 2010.
    Why is the system taking 15 days more?
    Regards.

    Hi
    Which fiscal year variant are you using?
    Did you maintain it in OAVH? Copy the existing entry in OAVH for K4 to your fiscal year variant and try again.
    Also, 04 does not figure for Acquisitions in AFAMP under 0004... Whatever you specify in OAVH must also be a part of AFAMP.
    Regards,
    Ajay M
    Edited by: Ajay Maheshwari on Oct 28, 2010 2:31 PM

  • Issue with list saving data after sites upgrade from sharepoint 2010 to sharepoint 2013

    Issue with a list not saving data after a site upgrade from SharePoint 2010 to SharePoint 2013.
    NewForm.aspx of the list:
    the custom list sometimes does not save data in the new form after 15 minutes; only a blank record gets created, without the data, even though some columns are mandatory fields.

    Hello dcakumar,
    Sounds like a strange issue. If you can reproduce this, do you see any errors in the ULS logs?
    - Dennis | Netherlands | Blog | Twitter

  • Performance issue with using buffering in an APPL0 or APPL1 table

    Hi,
    Can anyone please tell me whether there's any serious performance issue with using buffering for a master or transaction table? I'm asking because when I run the Code Inspector on my transparent table I get these information messages:
    message code 0011 ==> buffering is activated but delivery class is "A", and message code 0014 ==> buffering is activated but data class is "APPL1".
    So what is the alternative for improving performance?
    Thanks,
    Mahesh M.S.

    Hi,
    have you read the documentation?
    Let me paste it here for you:
    Buffering is switched on for the examined table and it has data class 'APPL0' or 'APPL1'.
    Tables with data class 'APPL0' or 'APPL1' should contain master or transaction data, so these tables either contain a large amount of data or their content changes frequently. Therefore buffering the table is unfavourable. Very large tables displace other tables in the buffer memory and hence slow down access to them. Transaction data should not be buffered because synchronizing the changes across the various application servers is very time-consuming.
    In exceptional cases, small master data tables ('APPL0', size category 0) can be buffered.
    The solution depends on the table content. If it is master or transaction data, the table should not be buffered. If the table content does not consist of master or transaction data, the data class should be corrected accordingly.
    This should answer your questions...
    Kind regards,
    Hermann

  • Any issues with using LDAP on LINUX for GRC 5.2 UME?

    Our company is converting our LDAP servers from AIX to Linux. The DNS name used in our UME connection should not change. Are there any issues with using LDAP on Linux? We are currently on GRC 5.2 SP9 (in the middle of upgrading to SP12).
    Also, I have been trying to connect our test UME system to a test LDAP box that has already been converted to Linux, but I keep getting a 'connection failed' error when I test it.
    Do you have to reboot the server to test a change to the LDAP connection? I've been trying it by going into UME, pulling up the LDAP tab, hitting the Modify button, entering the new user ID and password for the test LDAP, and hitting the Test Connection button. I've verified that this user ID and password are correct for the test LDAP.
    Is there a way to get more information about why the connection failed?
    Thanks.

    I've been told by our LDAP support group that none of the other configuration settings should have to be changed. I should only have to change the ID and password to connect to the test version of LDAP instead of our regular connection to the production LDAP.
    Can you test a connection with a different user ID/password without having to reboot/restart the server? Or do I need to change these two settings, save them, reboot/restart, and then hit the Test Connection button?
    Thanks.

  • Issues with using relative links in Captivate 8

    Is anyone else having issues with using relative links in Captivate 8? These links all used to work in the previous version of Captivate, and I could have sworn this was fixed once already in Captivate 8, but it's popping up again for us. Here is the situation: we have courses that are made up of multiple lessons, which are separate Captivate files. Within those lessons are buttons that link to external documents (which live in a shared document folder), demonstrations, etc. We use relative links because we post these to our Amazon servers, and we also sell them to clients who can post them on their own web servers or in their LMS, so we can't put in full paths for the links or we'd have to change them constantly. As an example, the link for a button might be "../Document/nameofdoc.pdf"; this goes to a user guide or something similar posted in the "Document" folder that lives at the same level as the lesson's folder. But now, all of a sudden, none of our bazillion links works. I've tried buttons, hyperlinks, and even the old click box - nothing works with relative links. And I did check the permissions on every file and folder on our Amazon server to verify nothing changed there as well. Any suggestions?

    I have the same issue with relative links in Captivate 8. I am trying to load Captivate modules into an LMS using relative links to document files within the LMS. The links work fine during a site page test, so it isn't an issue in the LMS, but from the Captivate module they aren't working...
    Help?

  • Issues with using the output redirection character with newer NXOS versions?

    Has anyone seen any issues with using the output redirection character with newer NX-OS versions?
    I am receiving "Error 0x40870004 while copying."
    Simply copying a file from bootflash to TFTP is OK.
    This occurs with both the 3CDaemon and Tftpd32 TFTP servers.
    I have tried it on multiple switches - same issue.
    Any known bugs?
    Thanks!
    The following is an example of bad (NX-OS 4.1.1b) and good (SAN-OS 3.2.1a) behaviour:
    MDS2# sho ver | inc system
      system:    version 4.1(1b)
      system image file is:    bootflash:///m9200-s2ek9-mz.4.1.1b.bin
      system compile time:     10/7/2008 13:00:00 [10/11/2008 09:52:55]
    MDS2# sh int br > tftp://10.73.54.194
    Trying to connect to tftp server......
    Connection to server Established. Copying Started.....
    TFTP put operation failed:Access violation
    Error 0x40870004 while copying tftp://10.73.54.194/
    MDS2# copy bootflash:cpu_logfile tftp://10.73.54.194
    Trying to connect to tftp server......
    Connection to server Established. Copying Started.....
    |
    TFTP put operation was successful
    MDS2#
    ck-ci9216-001# sho ver | inc system
      system:    version 3.2(1a)
      system image file is:    bootflash:/m9200-ek9-mz.3.2.1a.bin
      system compile time:     9/25/2007 18:00:00 [10/06/2007 06:46:51]
    ck-ci9216-001# sh int br > tftp://10.73.54.194
    Trying to connect to tftp server......
    |
    TFTP put operation was successful

    Please check with the new version of the Tftpd32 server. The error may be due to an older version of the TFTP server; the newer version available solved this error, and files now upload with no issues.
    1. Download tftpd32b.zip from:
    http://tftpd32.jounin.net/tftpd32_download.html
    2. Copy the tftpd32b.zip file into an empty directory and extract it.
    3. Copy the file you want to transfer into the directory containing tftpd32.exe.
    4. Run tftpd32.exe from that directory. The "Base Directory" field should show the path to the directory containing the file you want to transfer.
    At this point, the TFTP server is ready to begin serving files. As devices request files, the main tftpd32 window will log the requests.
    Best Regards...

  • HR ABAP: Issue with using 'nocommit' parameter on FM HR_INFOTYPE_OPERATION

    Issue with using the nocommit parameter on FM HR_INFOTYPE_OPERATION:
    My client has a requirement to create the following 4 infotypes in sequence in one LUW, i.e. either all are created or none is:
    9045 (custom infotype)
    0045
    0078
    0015
    I tried to use the nocommit parameter on FM HR_INFOTYPE_OPERATION to insert the 4 infotypes in nocommit mode, and at the end I issued COMMIT WORK. But to my surprise, only infotype 0015 was created in the database; the first three (9045, 0045 and 0078) did not make it to the database.
    I searched many threads on SDN but could not find a solution.
    Please let me know if there is any way to implement the LUW.
    Your inputs will be appreciated.

    Hi,
    I think you can also try FM HR_MAINTAIN_MASTERDATA; see its documentation.
    nocommit works like a simulation mode, so what you can do is:
    call the FM for all infotypes with nocommit set and collect all error messages, if any; then, if everything is clean, call the FM for all infotypes again without nocommit (i.e. passing space), as sketched below.
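    For example, roughly like this (a sketch only - the variable names are assumed, error handling is simplified, and the loop over your four infotypes is omitted):
    " lv_simulate = 'X' on the first pass (check only); space on the second
    " pass so each insert is actually posted, followed by COMMIT WORK.
    CALL FUNCTION 'HR_INFOTYPE_OPERATION'
      EXPORTING
        infty         = '0045'
        number        = lv_pernr       " personnel number (assumed variable)
        record        = ls_p0045       " filled infotype record (assumed)
        validitybegin = lv_begda
        validityend   = lv_endda
        operation     = 'INS'
        nocommit      = lv_simulate
      IMPORTING
        return        = ls_return.
    " Collect ls_return from each simulated call; only if all four infotypes
    " come back clean do the second, real pass.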
    regards
    prabhu

  • Are there any advantages to using a Data Value Reference for LabVIEW Classes?

    Hi,
    I recently came across an example where the developer had used a data value reference for the class cluster in LabVIEW.
    What are the advantages of doing this?
    Doesn't the use of LV objects already avoid the creation of multiple copies of data, thereby reducing memory usage?
    Thanks,
    AD

    LabVIEW's OOP is implemented by value. This means, as tst stated, branches in wires can mean copies of the object. The DVR is a way to make it by reference.
    There are only two ways to tell somebody thanks: Kudos and Marked Solutions
    Unofficial Forum Rules and Guidelines

  • Hi... Anyone know of any issues with using Logic Pro music software with Mountain Lion OS? I have heard rumours in the past - is that all sorted now?

    Hi...
    Anyone know of any issues with using Logic Pro music software with Mountain Lion OS?
    I have heard rumours in the past - is that all sorted now?

    At least Logic Pro 9 and Logic Pro X work correctly in Mountain Lion. Furthermore, Logic Pro X requires the latest OS X Mountain Lion version.
