Dimensional measurement in Lightroom?

I work for a testing laboratory, and a very critical part of our operations is image archival.  We currently use a system called Pax-it!, which in my opinion is absolutely outdated and a miserable user experience.  I am wondering if Lightroom could be a replacement for Pax-it!, as I know its user friendliness is much better.
This archival process also includes permanently tying sample names, dates, clients, photographers, project numbers, etc., to the image. I am familiar enough with Lightroom to know that I could use metadata entries and tags to keep this information together.
The piece of information I cannot find online is regarding measurement. Our current system is hooked up to two live-screen microscopes, where we can capture images on the adjacent screen. I believe Lightroom allows for external sources to be viewed as a live display. However, we often shoot these images from the microscope, enter the zoom of the microscope into the software, and then we are able to use a cursor-drag function to measure line-to-line distances, add micron bars, and calculate geometric areas. I am aware that Photoshop currently has this function, but I am looking for it in Lightroom.
I am aware that this is not the usual application of Lightroom, but I think it might just work if a measurement function does exist.  Any ideas?

There's nothing like that in Lightroom.
Also, Lightroom will allow tethered shooting, but only from a handful of cameras, mainly certain Nikon and Canon models. I'm reasonably sure that you can't do it from your microscope.
You can use Lightroom as a Digital Asset Management system for your images (assuming they're JPG, TIFF, PSD, or DNG), and send them to Photoshop for your dimensional work.
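For what it's worth, the arithmetic behind that kind of measurement tool is simple enough to sanity-check on its own. A rough sketch in Python (the pixel pitch and magnification are made-up numbers; substitute the values for your own microscope/camera setup):

    def microns_per_pixel(pixel_pitch_um, magnification):
        """Real-world size of one image pixel at the sample plane."""
        return pixel_pitch_um / magnification

    def distance_um(x1, y1, x2, y2, scale_um_per_px):
        """Line-to-line distance between two clicked points, in microns."""
        return ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5 * scale_um_per_px

    # Example: 3.45 um sensor pixels behind a 20x objective
    scale = microns_per_pixel(3.45, 20.0)            # 0.1725 um per pixel
    print(distance_um(100, 100, 612, 100, scale))    # 512 px -> ~88.3 um

That doesn't replace Photoshop's measurement tools, but it shows there's nothing Lightroom-specific about the math.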
Hal

Similar Messages

  • BIB-9509 error when trying to create query on 7 Dimensional Measure

    PART1 of the text below -----------------------------
    Is there a limit on the ability to create a BI Beans query on a measure that is associated with a large number of dimensions?
    Basically I am trying to create a BI Beans query on a measure which has 7 dimensions.
    When I right-click on my ProjectDesigner, I select 'New Query'.
    The Query Wizard appears as expected, and I follow all instructions on Step 2 of the wizard.
    Step 2:
    A list appears of all the available measures and associated dimensions that can be used.
    I scroll down the list and sure enough my 7-dimensional measure is available.
    I select it and it places it in the right box. I see a tree structure that shows me the measure
    and its associated dimensions and their hierarchies.
    Step 3:
    I click Next, then I enter the Layout page.
    The layout initially has 5 page dims, 1 across dim, and 1 down dim. I can re-orientate the tiles to manipulate the layout with no problems.
    Now when I press Next, I hit a big problem.
    The wizard errors with the following problem:
    BIB-9509 Oracle OLAP did not create cursor.
    oracle.express.ExpressServerExceptionError class: OLAPI
    Server error descriptions:
    DPR: Unable to create server cursor, Generic at TxsOqDefinitionManager::createCursorManager
    [Long postings are being truncated to ~1 kB at this time.]

    PART 2 of the text below ------------------------------
    I retried all of the above with 1-, 2- and 3-dimensional measures and I had no problems whatsoever. I was able to complete the query and select the appropriate dimension values.
    The dimensional information is held in an Analytic Workspace. I have created the correct relational views using OLAP_TABLE and registered these views using the CWM2 packages, so they can be viewed by BI Beans/JDeveloper.
    I have installed the latest production JDeveloper and BI Beans 9.0.3.1035 and Oracle 9.2.0.2 with all the
    appropriate patches.
    Windows 2000 OS with 512 MB.
    Can anybody from the BI Beans team help me?
    Thank you,
    Richard

  • Help needed in OSA

    Hi,
    When building a database in RAM, I have defined two attributes, DATE and TIMESPAN, in the time dimension. I checked the "Install as a variable" checkbox there.
    Now, after building the RAM, when I open a new report, I can see the two attributes (DATE and TIMESPAN) in the measure list.
    The problem is that if I want to create a new custom measure (through Tools/Maintain Measure) and try to use the variable TIMESPAN in some calculation, I don't have that variable (TIMESPAN) there in the measure list. How do I get the variables (TIMESPAN, as well as DATE) into my variable list so I can use them in calculations along with the other base measures?
    Can anyone help me with this?
    Please reply soon.
    Shimant

    I'd have to reproduce the problem to be sure, but if DATE and TIMESPAN get represented as text measures (as I think they do, since they are created from attributes), what you're trying to do won't work. You can't create custom measures based on text measures (probably because you can't "add" them); they get filtered out. You would have to create them as numeric one-dimensional measures. We ran into this when we wanted to do something similar.
    Email me if you need more info.

  • EIS error compression

    Hi,
    When I load members with Essbase Integration Services I get the following error message:
    STATUS: Applying compression to dimension 'Measures'. IS ERROR: Member load terminated with error.
    Can someone help?
    Thanks.

    Check these things:
    - How many dims in the OTL?
    - How many columns in the fact table?
    - Does the natively generated SQL work fine?
    - Can you read the load logs?
    - Are you sure that Essbase works fine?
      - unlock the database
      - are you using external authentication? Can you create an app from EAS?
    Another way to load data from the fact table: use rules files and ODBC.

  • 'Select a measure:' stuck on 'Loading...' in Dashboard Designer KPI Dimensional Data Source Mapping

    [using SharePoint 2013 Enterprise SP1]
    I am trying to create a KPI in Dashboard Designer, but am getting a timeout. I have been doing this for a while on my site; this is not the first. I haven't had this problem before.
    I created a new KPI and clicked on the Data Mappings column value, which is a hyperlink, to bring up the Dimensional Data Source Mapping dialog. I switched to a Data Connection in the site I just created (DC works perfectly and can retrieve sample data).
    When I click the "Select a measure:" drop-down menu, I get the message "Loading..." and after a while (a minute? two?) a dialog pops up with:
    The request took too long to complete. SharePoint is currently unavailable or experiencing heavy traffic. Try again later.
    This is a test SP server and I'm the only one on it, so there is no load. Also, as mentioned, I am able to verify the Data Connection without problem. I am not having any issue with any of my other few dozen KPIs/Data Connections. Any suggestions as to how to troubleshoot?

    Hi cgtyoder,
    According to your description, my understanding is that you got an error when you created a KPI in Dashboard Designer.
    Please try to recycle the PerformancePoint Services application pool account, then compare the result.
    Please go to C:\inetpub\wwwroot\wss\VirtualDirectories\<port of the web application> and adjust the HttpRuntime executionTimeout for the web application by modifying the web.config; PerformancePoint report stability is much better afterwards:
    <httpRuntime executionTimeout="600" maxRequestLength="51200" />
    Note: before you change the web.config file, please make a backup of the file.
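    For orientation, the httpRuntime element lives under system.web in that web.config. A minimal sketch of the surrounding structure (only the two attribute values come from this thread; the rest is standard placement):
    <configuration>
      <system.web>
        <!-- executionTimeout is in seconds; maxRequestLength is in KB -->
        <httpRuntime executionTimeout="600" maxRequestLength="51200" />
      </system.web>
    </configuration>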
    If this issue still exists, please go to the log file to find more information about this issue.
    I hope this helps.
    Thanks,
    Wendy
    Forum Support
    Please remember to mark the replies as answers if they help and unmark them if they provide no help. If you have feedback for TechNet Subscriber Support, contact
    [email protected]
    Wendy Li
    TechNet Community Support

  • Multiplying Measures with different dimensionality

    New to Essbase but very experienced with OLAP
    My cube has 3 dimensions FCO, STAT and VALID
    1st Measure DATA1 is stored at the leaf values for FCO and STAT and at the VALIDTotal (parent) for the Valid Dim
    2nd Measure DATA2 is stored at the leaf values for FCO and VALID and at the StatTotal (parent) for the Stat Dim
    (Essentially the VALID dimension in DATA1 is irrelevant and the STAT dimension in DATA2 is irrelevant)
    When I do a simple calc of DATA1 * DATA2; no data is returned because when you match the dimensions one measure has data at the leaf level but the other doesn't.
    I need to multiply each of the leaf values of the STAT dimension in Measure DATA1 with each leaf value of the VALID dimension in Measure DATA2
    Below is an example
    Data1
            Stat1   Stat2   Stat3   StatTotal
    FCO1      10      15       5        -
    FCO2      20      25      10        -
    FCO3      30      35      15        -
    FCOT      60      75      30        -
    Data2
            Valid1  Valid2  Valid3  ValidTotal
    FCO1       1       1       0        -
    FCO2       1       0       1        -
    FCO3       1       1       1        -
    Total      -       -       -        -
    Desired Results:
            Valid1  Valid2  Valid3  Valid1  Valid2  Valid3  Valid1  Valid2  Valid3
            Stat1   Stat1   Stat1   Stat2   Stat2   Stat2   Stat3   Stat3   Stat3
    FCO1      10      10       0      15      15       0       5       5       0
    FCO2      20       0      20      25       0      25      10       0      10
    FCO3      30      30      30      35      35      35      15      15      15
    Total     60      40      50      75      50      60      30      20      25
    Does anyone have a suggestion as to what the formula would look like?
    Thanks in advance

    The forum tends to mangle the formatting of posts, which made yours difficult to read, but I think I deciphered what you had (putting zeroes in instead of leaving cells blank would have helped with the Data2 section).
    BTW: Here is how I decoded the data sets you provided (with slight name changes):
            Stat1   Stat2   Stat3   StatT        Valid1  Valid2  Valid3  ValidT
    FCO1      10      15       5      -             1       1       0      -
    FCO2      20      25      10      -             1       0       1      -
    FCO3      30      35      15      -             1       1       1      -
    FCOT      60      75      30      -             -       -       -      -

            Valid1  Valid2  Valid3  Valid1  Valid2  Valid3  Valid1  Valid2  Valid3
            Stat1   Stat1   Stat1   Stat2   Stat2   Stat2   Stat3   Stat3   Stat3
    FCO1      10      10       0      15      15       0       5       5       0
    FCO2      20       0      20      25       0      25      10       0      10
    FCO3      30      30      30      35      35      35      15      15      15
    FCOT      60      40      50      75      50      60      30      20      25

    For future reference, you can copy/paste Excel data inside a "code" tag to show it like I did above.
    Here is what I would do, assuming I interpreted the need correctly as above:
    1) Create a non-aggregating Stat_Input member (in the Valid dimension) to load the first data set.
    2) Create a non-aggregating Valid_Input member (in the Stat dimension) to load the second data set.
    3) Load the first data set to the new Stat_Input member of the Valid dimension.
    4) Load the second data set to the new Valid_Input member of the Stat dimension.
    5) Use a calc script to spread the input member to the Stat1/Stat2/Stat3 members based on the Valid1/Valid2/Valid3 values:
    - Stat1 = Stat_Input * Valid1->Valid_Input;
    - Stat2 = Stat_Input * Valid2->Valid_Input;
    - Stat3 = Stat_Input * Valid3->Valid_Input;
    It may seem confusing to have the Valid_Input in the Stat dimension (and vice versa), but the point is that this is where the inputs for the respective data go, and then they are spread to the intersections based on cross-dimensional references that "link" the inputs to the other members. If you were to give the members names based on the current dimension, it would be more confusing because they wouldn't represent the purpose that they serve.
    I realize this is likely to generate more questions, but if you can give this approach a try you may find that it isn't as difficult as it seems.
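    To make step 5 concrete, here is a minimal calc script sketch (member names as above; note I let the current Valid member supply the flag via "Valid_Input" rather than pinning each statement to Valid1/2/3 - adjust the FIX to match your outline):

    /* Spread data1 (loaded at Valid = Stat_Input) by the data2 flags */
    /* (loaded at Stat = Valid_Input) across the leaf intersections.  */
    FIX (@LEVMBRS ("FCO", 0), "Valid1":"Valid3")
        "Stat1" = "Stat_Input" * "Valid_Input";
        "Stat2" = "Stat_Input" * "Valid_Input";
        "Stat3" = "Stat_Input" * "Valid_Input";
    ENDFIX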

  • Loading measures to a cube

    hi
    How do I load measures to a cube?
    I created a cube, and validated and deployed it successfully.
    I am loading only a single measure from a table, but it does not load anything.
    Even the keys of the dimensions are not in the database table.
    Do I need to map only the measure to the proper row in the cube, or do the keys also need to be mapped?
    regards
    Arif
    P.S. I am unable to include a screen dump in the forum; how do I do that?
    Edited by: badwanpk on Nov 4, 2008 2:36 PM

    Yes, you need to build a row for insertion to the fact by joining your source tables.
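    In SQL terms it is something along these lines (table and column names are purely illustrative, not from your model):

    -- Join the source rows to each dimension to pick up its surrogate key,
    -- then insert the completed row into the fact table.
    INSERT INTO sales_fact (time_key, product_key, sales_amount)
    SELECT t.time_key,
           p.product_key,
           s.amount
      FROM src_sales    s
      JOIN time_dim     t ON t.calendar_date = s.sale_date
      JOIN product_dim  p ON p.product_code  = s.product_code;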
    At the moment you are building a Data Mart/Star Schema (fact and dimension tables, i.e. a cube); this Star Schema will be used for reporting, i.e. there is no need to go back to the source. Ultimately, this could grow into a Data Warehouse that has dimensions and facts populated from many sources, allowing reporting across your whole organisation rather than a single application. The dimensions are likely to contain far more descriptive (i.e. text) attributes than the source systems and will be used to constrain your reporting queries.
    I suggest reading some articles/books by Ralph Kimball; they should give you a good overview of Data Warehousing and Dimensional modelling.
    http://www.kimballgroup.com/html/articles.html
    Si

  • Photoshop CC 2014 crashes when trying to create a panorama from Lightroom?

    Hi,
    I have both Photoshop and Lightroom CC installed.
    When I pass 6 photos from Lightroom to Photoshop CC to create a panorama, Photoshop goes through the throes, imports the images, generates the panorama, then crashes, with a message from Windows saying it is checking for a solution to the problem, then dies.
    I have reviewed the posts, and found one that suggests increasing the cache level to 6... tried this and it still dies.
    Found another workaround: have Photoshop x64 running and pass the images to it rather than have Lr fire up Ps CC. This worked and produced the panorama.
    I am not sure why, but with the use of the CC app, I now have three Ps versions on my machine: x32, x64 and CC... not even sure what the difference between x64 and CC is... ???...
    So my question is...
    Is there a problem with Ps CC?... there certainly was with the focus stacking algorithms... or is it just me?.. ;(
    Thanks for any help with this...
    Cheers
    Keith Friend.
    The techy details from Ps CC 2014 are as follows; this is the version that Lr fires up:
    Adobe Photoshop Version: 2014.2.2 20141204.r.310 2014/12/04:23:59:59 CL 994532  x64
    Operating System: Windows 7 64-bit
    Version: 6.1 Service Pack 1
    System architecture: Intel CPU Family:6, Model:10, Stepping:9 with MMX, SSE Integer, SSE FP, SSE2, SSE3, SSE4.1, SSE4.2, AVX, HyperThreading
    Physical processor count: 4
    Logical processor count: 8
    Processor speed: 2294 MHz
    Built-in memory: 8087 MB
    Free memory: 4027 MB
    Memory available to Photoshop: 7067 MB
    Memory used by Photoshop: 69 %
    3D Multitone Printing: Disabled.
    Windows 2x UI: Disabled.
    Highbeam: Enabled.
    Image tile size: 128K
    Image cache levels: 6
    Font Preview: Medium
    TextComposer: Latin
    Display: 1
    Display Bounds: top=0, left=0, bottom=1080, right=1920
    Display: 2
    Display Bounds: top=-120, left=1920, bottom=1080, right=3840
    OpenGL Drawing: Enabled.
    OpenGL Allow Old GPUs: Not Detected.
    OpenGL Drawing Mode: Advanced
    OpenGL Allow Normal Mode: True.
    OpenGL Allow Advanced Mode: True.
    AIFCoreInitialized=1
    AIFOGLInitialized=1
    OGLContextCreated=1
    NumGLGPUs=1
    NumCLGPUs=2
    glgpu[0].GLVersion="3.0"
    glgpu[0].GLMemoryMB=2112
    glgpu[0].GLName="Intel(R) HD Graphics 4000"
    glgpu[0].GLVendor="Intel"
    glgpu[0].GLVendorID=32902
    glgpu[0].GLDriverVersion="8.15.10.2712"
    glgpu[0].GLRectTextureSize=8192
    glgpu[0].GLRenderer="Intel(R) HD Graphics 4000"
    glgpu[0].GLRendererID=358
    glgpu[0].HasGLNPOTSupport=1
    glgpu[0].GLDriver="igdumd64.dll,igd10umd64.dll,igd10umd64.dll,igdumd32,igd10umd32,igd10umd 32"
    glgpu[0].GLDriverDate="20120326000000.000000-000"
    glgpu[0].CanCompileProgramGLSL=1
    glgpu[0].GLFrameBufferOK=1
    glgpu[0].glGetString[GL_SHADING_LANGUAGE_VERSION]="1.30 - Intel Build 8.15.10.2712"
    glgpu[0].glGetProgramivARB[GL_FRAGMENT_PROGRAM_ARB][GL_MAX_PROGRAM_INSTRUCTIONS_ARB]=[1447 ]
    glgpu[0].glGetIntegerv[GL_MAX_TEXTURE_UNITS]=[8]
    glgpu[0].glGetIntegerv[GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS]=[16]
    glgpu[0].glGetIntegerv[GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS]=[16]
    glgpu[0].glGetIntegerv[GL_MAX_TEXTURE_IMAGE_UNITS]=[16]
    glgpu[0].glGetIntegerv[GL_MAX_DRAW_BUFFERS]=[8]
    glgpu[0].glGetIntegerv[GL_MAX_VERTEX_UNIFORM_COMPONENTS]=[1024]
    glgpu[0].glGetIntegerv[GL_MAX_FRAGMENT_UNIFORM_COMPONENTS]=[1024]
    glgpu[0].glGetIntegerv[GL_MAX_VARYING_FLOATS]=[41]
    glgpu[0].glGetIntegerv[GL_MAX_VERTEX_ATTRIBS]=[16]
    glgpu[0].extension[AIF::OGL::GL_ARB_VERTEX_PROGRAM]=1
    glgpu[0].extension[AIF::OGL::GL_ARB_FRAGMENT_PROGRAM]=1
    glgpu[0].extension[AIF::OGL::GL_ARB_VERTEX_SHADER]=1
    glgpu[0].extension[AIF::OGL::GL_ARB_FRAGMENT_SHADER]=1
    glgpu[0].extension[AIF::OGL::GL_EXT_FRAMEBUFFER_OBJECT]=1
    glgpu[0].extension[AIF::OGL::GL_ARB_TEXTURE_RECTANGLE]=1
    glgpu[0].extension[AIF::OGL::GL_ARB_TEXTURE_FLOAT]=1
    glgpu[0].extension[AIF::OGL::GL_ARB_OCCLUSION_QUERY]=1
    glgpu[0].extension[AIF::OGL::GL_ARB_VERTEX_BUFFER_OBJECT]=1
    glgpu[0].extension[AIF::OGL::GL_ARB_SHADER_TEXTURE_LOD]=0
    clgpu[0].CLPlatformVersion="1.1"
    clgpu[0].CLDeviceVersion="1.1 CUDA"
    clgpu[0].CLMemoryMB=2048
    clgpu[0].CLName="GeForce GT 650M"
    clgpu[0].CLVendor="NVIDIA Corporation"
    clgpu[0].CLVendorID=4318
    clgpu[0].CLDriverVersion="331.65"
    clgpu[0].CUDASupported=1
    clgpu[0].CUDAVersion="6.0.1"
    clgpu[0].CLBandwidth=4.46455e+010
    clgpu[0].CLCompute=150.537
    clgpu[1].CLPlatformVersion="1.1 "
    clgpu[1].CLDeviceVersion="1.1 "
    clgpu[1].CLMemoryMB=1624
    clgpu[1].CLName="Intel(R) HD Graphics 4000"
    clgpu[1].CLVendor="Intel(R) Corporation"
    clgpu[1].CLVendorID=32902
    clgpu[1].CLDriverVersion="8.15.10.2712"
    clgpu[1].CUDASupported=0
    clgpu[1].CLBandwidth=1.40476e+010
    clgpu[1].CLCompute=84.3036
    License Type: Subscription
    Serial number: 96040231285882737802
    Application folder: C:\Program Files\Adobe\Adobe Photoshop CC 2014\
    Temporary file path: C:\Users\Keith\AppData\Local\Temp\
    Photoshop scratch has async I/O enabled
    Scratch volume(s):
      Startup, 909.7G, 639.9G free
    Required Plug-ins folder: C:\Program Files\Adobe\Adobe Photoshop CC 2014\Required\Plug-Ins\
    Primary Plug-ins folder: C:\Program Files\Adobe\Adobe Photoshop CC 2014\Plug-ins\
    Installed components:
       A3DLIBS.dll   A3DLIB Dynamic Link Library   9.2.0.112  
       ACE.dll   ACE 2014/08/12-23:42:09   79.557478   79.557478
       adbeape.dll   Adobe APE 2013/02/04-09:52:32   0.1160850   0.1160850
       AdbePM.dll   PatchMatch 2014/09/07-21:07:38   79.558079   79.558079
       AdobeLinguistic.dll   Adobe Linguisitc Library   8.0.0  
       AdobeOwl.dll   Adobe Owl   5.2.4  
       AdobePDFL.dll   PDFL 2014/08/18-15:13:12   79.512424   79.512424
       AdobePIP.dll   Adobe Product Improvement Program   7.2.1.3399  
       AdobeXMP.dll   Adobe XMP Core 2014/08/20-09:53:02   79.156797   79.156797
       AdobeXMPFiles.dll   Adobe XMP Files 2014/08/20-09:53:02   79.156797   79.156797
       AdobeXMPScript.dll   Adobe XMP Script 2014/08/20-09:53:02   79.156797   79.156797
       adobe_caps.dll   Adobe CAPS   8,0,0,13  
       AGM.dll   AGM 2014/08/12-23:42:09   79.557478   79.557478
       ahclient.dll    AdobeHelp Dynamic Link Library   1,8,0,31  
       amtlib.dll   AMTLib (64 Bit)   8.0.0.122212002 BuildVersion: 8.0; BuildDate: Wed Jul 30 2014 15:59:34)   1.000000
       ARE.dll   ARE 2014/08/12-23:42:09   79.557478   79.557478
       AXE8SharedExpat.dll   AXE8SharedExpat 2013/12/20-21:40:29   79.551013   79.551013
       AXEDOMCore.dll   AXEDOMCore 2013/12/20-21:40:29   79.551013   79.551013
       Bib.dll   BIB 2014/08/12-23:42:09   79.557478   79.557478
       BIBUtils.dll   BIBUtils 2014/08/12-23:42:09   79.557478   79.557478
       boost_date_time.dll   photoshopdva   8.0.0  
       boost_signals.dll   photoshopdva   8.0.0  
       boost_system.dll   photoshopdva   8.0.0  
       boost_threads.dll   photoshopdva   8.0.0  
       cg.dll   NVIDIA Cg Runtime   3.0.00007  
       cgGL.dll   NVIDIA Cg Runtime   3.0.00007  
       CIT.dll   Adobe CIT   2.2.6.32411   2.2.6.32411
       CITThreading.dll   Adobe CITThreading   2.2.6.32411   2.2.6.32411
       CoolType.dll   CoolType 2014/08/12-23:42:09   79.557478   79.557478
       dvaaudiodevice.dll   photoshopdva   8.0.0  
       dvacore.dll   photoshopdva   8.0.0  
       dvamarshal.dll   photoshopdva   8.0.0  
       dvamediatypes.dll   photoshopdva   8.0.0  
       dvametadata.dll   photoshopdva   8.0.0  
       dvametadataapi.dll   photoshopdva   8.0.0  
       dvametadataui.dll   photoshopdva   8.0.0  
       dvaplayer.dll   photoshopdva   8.0.0  
       dvatransport.dll   photoshopdva   8.0.0  
       dvaui.dll   photoshopdva   8.0.0  
       dvaunittesting.dll   photoshopdva   8.0.0  
       dynamiclink.dll   photoshopdva   8.0.0  
       ExtendScript.dll   ExtendScript 2014/01/21-23:58:55   79.551519   79.551519
       icucnv40.dll   International Components for Unicode 2013/02/25-15:59:15    Build gtlib_4.0.19090  
       icudt40.dll   International Components for Unicode 2013/02/25-15:59:15    Build gtlib_4.0.19090  
       igestep30.dll   IGES Reader   9.3.0.113  
       imslib.dll   IMSLib DLL   7.0.0.154  
       JP2KLib.dll   JP2KLib 2014/06/28-00:28:27   79.254012   79.254012
       libifcoremd.dll   Intel(r) Visual Fortran Compiler   10.0 (Update A)  
       libiomp5md.dll   Intel(R) OpenMP* Runtime Library   5.0  
       libmmd.dll   Intel(r) C Compiler, Intel(r) C++ Compiler, Intel(r) Fortran Compiler   12.0  
       LogSession.dll   LogSession   7.2.1.3399  
       mediacoreif.dll   photoshopdva   8.0.0  
       MPS.dll   MPS 2014/08/18-23:43:19   79.557676   79.557676
       pdfsettings.dll   Adobe PDFSettings   1.04  
       Photoshop.dll   Adobe Photoshop CC 2014   15.2.2  
       Plugin.dll   Adobe Photoshop CC 2014   15.2.2  
       PlugPlugExternalObject.dll   Adobe(R) CEP PlugPlugExternalObject Standard Dll (64 bit)   5.0.0  
       PlugPlugOwl.dll   Adobe(R) CSXS PlugPlugOwl Standard Dll (64 bit)   5.2.0.54  
       PSArt.dll   Adobe Photoshop CC 2014   15.2.2  
       PSViews.dll   Adobe Photoshop CC 2014   15.2.2  
       SCCore.dll   ScCore 2014/01/21-23:58:55   79.551519   79.551519
       ScriptUIFlex.dll   ScriptUIFlex 2014/01/20-22:42:05   79.550992   79.550992
       svml_dispmd.dll   Intel(r) C Compiler, Intel(r) C++ Compiler, Intel(r) Fortran Compiler   12.0  
       tbb.dll   Intel(R) Threading Building Blocks for Windows   4, 2, 2013, 1114  
       tbbmalloc.dll   Intel(R) Threading Building Blocks for Windows   4, 2, 2013, 1114  
       TfFontMgr.dll   FontMgr   9.3.0.113  
       TfKernel.dll   Kernel   9.3.0.113  
       TFKGEOM.dll   Kernel Geom   9.3.0.113  
       TFUGEOM.dll   Adobe, UGeom©   9.3.0.113  
       updaternotifications.dll   Adobe Updater Notifications Library   8.0.0.14 (BuildVersion: 1.0; BuildDate: BUILDDATETIME)   8.0.0.14
       VulcanControl.dll   Vulcan Application Control Library   5.0.0.82  
       VulcanMessage5.dll   Vulcan Message Library   5.0.0.82  
       WRServices.dll   WRServices Fri Mar 07 2014 15:33:10   Build 0.20204   0.20204
       wu3d.dll   U3D Writer   9.3.0.113  
    Required plug-ins:
       3D Studio 15.2.2 (2014.2.2 x001 x003)
       Accented Edges 15.2.2
       Adaptive Wide Angle 15.2.2
       Angled Strokes 15.2.2
       Average 15.2.2 (2014.2.2 x001 x003)
       Bas Relief 15.2.2
       BMP 15.2.2
       Camera Raw 8.7.1
       Camera Raw Filter 8.7.1
       Chalk & Charcoal 15.2.2
       Charcoal 15.2.2
       Chrome 15.2.2
       Cineon 15.2.2 (2014.2.2 x001 x003)
       Clouds 15.2.2 (2014.2.2 x001 x003)
       Collada 15.2.2 (2014.2.2 x001 x003)
       Color Halftone 15.2.2
       Colored Pencil 15.2.2
       CompuServe GIF 15.2.2
       Conté Crayon 15.2.2
       Craquelure 15.2.2
       Crop and Straighten Photos 15.2.2 (2014.2.2 x001 x003)
       Crop and Straighten Photos Filter 15.2.2
       Crosshatch 15.2.2
       Crystallize 15.2.2
       Cutout 15.2.2
       Dark Strokes 15.2.2
       De-Interlace 15.2.2
       Dicom 15.2.2
       Difference Clouds 15.2.2 (2014.2.2 x001 x003)
       Diffuse Glow 15.2.2
       Displace 15.2.2
       Dry Brush 15.2.2
       Eazel Acquire 15.2.2 (2014.2.2 x001 x003)
       Embed Watermark 4.0
       Entropy 15.2.2 (2014.2.2 x001 x003)
       Export Color Lookup Tables NO VERSION
       Extrude 15.2.2
       FastCore Routines 15.2.2 (2014.2.2 x001 x003)
       Fibers 15.2.2
       Film Grain 15.2.2
       Filter Gallery 15.2.2
       Flash 3D 15.2.2 (2014.2.2 x001 x003)
       Fresco 15.2.2
       Glass 15.2.2
       Glowing Edges 15.2.2
       Google Earth 4 15.2.2 (2014.2.2 x001 x003)
       Grain 15.2.2
       Graphic Pen 15.2.2
       Halftone Pattern 15.2.2
       HDRMergeUI 15.2.2
       HSB/HSL 15.2.2
       IFF Format 15.2.2
       IGES 15.2.2 (2014.2.2 x001 x003)
       Ink Outlines 15.2.2
       JPEG 2000 15.2.2
       Kurtosis 15.2.2 (2014.2.2 x001 x003)
       Lens Blur 15.2.2
       Lens Correction 15.2.2
       Lens Flare 15.2.2
       Liquify 15.2.2
       Matlab Operation 15.2.2 (2014.2.2 x001 x003)
       Maximum 15.2.2 (2014.2.2 x001 x003)
       Mean 15.2.2 (2014.2.2 x001 x003)
       Measurement Core 15.2.2 (2014.2.2 x001 x003)
       Median 15.2.2 (2014.2.2 x001 x003)
       Mezzotint 15.2.2
       Minimum 15.2.2 (2014.2.2 x001 x003)
       MMXCore Routines 15.2.2 (2014.2.2 x001 x003)
       Mosaic Tiles 15.2.2
       Multiprocessor Support 15.2.2 (2014.2.2 x001 x003)
       Neon Glow 15.2.2
       Note Paper 15.2.2
       NTSC Colors 15.2.2 (2014.2.2 x001 x003)
       Ocean Ripple 15.2.2
       OpenEXR 15.2.2
       Paint Daubs 15.2.2
       Palette Knife 15.2.2
       Patchwork 15.2.2
       Paths to Illustrator 15.2.2
       PCX 15.2.2 (2014.2.2 x001 x003)
       Photocopy 15.2.2
       Photoshop 3D Engine 15.2.2 (2014.2.2 x001 x003)
       Photoshop Touch 14.0
       Picture Package Filter 15.2.2 (2014.2.2 x001 x003)
       Pinch 15.2.2
       Pixar 15.2.2 (2014.2.2 x001 x003)
       Plaster 15.2.2
       Plastic Wrap 15.2.2
       PLY 15.2.2 (2014.2.2 x001 x003)
       PNG 15.2.2
       Pointillize 15.2.2
       Polar Coordinates 15.2.2
       Portable Bit Map 15.2.2 (2014.2.2 x001 x003)
       Poster Edges 15.2.2
       PRC 15.2.2 (2014.2.2 x001 x003)
       Radial Blur 15.2.2
       Radiance 15.2.2 (2014.2.2 x001 x003)
       Range 15.2.2 (2014.2.2 x001 x003)
       Read Watermark 4.0
       Render Color Lookup Grid NO VERSION
       Reticulation 15.2.2
       Ripple 15.2.2
       Rough Pastels 15.2.2
       Save for Web 15.2.2
       ScriptingSupport 15.2.2
       Shake Reduction 15.2.2
       Shear 15.2.2
       Skewness 15.2.2 (2014.2.2 x001 x003)
       Smart Blur 15.2.2
       Smudge Stick 15.2.2
       Solarize 15.2.2 (2014.2.2 x001 x003)
       Spatter 15.2.2
       Spherize 15.2.2
       Sponge 15.2.2
       Sprayed Strokes 15.2.2
       Stained Glass 15.2.2
       Stamp 15.2.2
       Standard Deviation 15.2.2 (2014.2.2 x001 x003)
       STL 15.2.2 (2014.2.2 x001 x003)
       Sumi-e 15.2.2
       Summation 15.2.2 (2014.2.2 x001 x003)
       Targa 15.2.2
       Texturizer 15.2.2
       Tiles 15.2.2
       Torn Edges 15.2.2
       Twirl 15.2.2
       U3D 15.2.2 (2014.2.2 x001 x003)
       Underpainting 15.2.2
       Vanishing Point 15.2.2
       Variance 15.2.2 (2014.2.2 x001 x003)
       Virtual Reality Modeling Language | VRML 15.2.2 (2014.2.2 x001 x003)
       Water Paper 15.2.2
       Watercolor 15.2.2
       Wave 15.2.2
       Wavefront|OBJ 15.2.2 (2014.2.2 x001 x003)
       WIA Support 15.2.2 (2014.2.2 x001 x003)
       Wind 15.2.2
       Wireless Bitmap 15.2.2 (2014.2.2 x001 x003)
       ZigZag 15.2.2
    Optional and third party plug-ins: NONE
    Plug-ins that failed to load: NONE
    Flash:
       Libraries
       Adobe Color Themes
    Installed TWAIN devices: NONE

    Try a preference reset: Press Cmd+Opt+Shift (Mac) or Ctrl+Alt+Shift (Win) immediately after initiating Photoshop startup. Select "yes" when prompted.

  • Please help with strange monitor/Lightroom 4 interaction problem

    I bought a Dell Ultrasharp U3011 yesterday, and I'm now facing a very strange problem. It's sort of related to the common discussed issue of Lightroom's color management, but there is a twist here. I'll explain.
    The monitor I had been using (and still have) was a Samsung SyncMaster 244T. I had no problems at all with it.
    With the U3011, I've run into this problem:
    -When I view files in Lightroom 4, the colors are muted and contrast low. The same files seen in XnView, ACDSee Pro (with the color management turned off), and Digital Photo Professional (colorspace set to sRGB) are much more vivid. In Photoshop, if I use the default color setting, it looks muted like Lightroom (I'm guessing because Adobe products have the default color profile set to ProPhoto RGB), but if I set the color profile to the monitor's profile, it becomes vivid and matches the other software I mentioned. If I use any other profile such as sRGB IEC61966-2.1, ProPhoto RGB, Adobe RGB, etc., it doesn't match the other software I mentioned. This is strange because the other ones match each other because they are set to sRGB, but in Photoshop, even if I set the color profile to sRGB, it still doesn't match.
    -In Lightroom 4, even if I use Soft Proofing, nothing changes. It's as if Soft Proofing is broken--it doesn't do anything, no matter what color profile I choose.
    -Now, here's the twist. I didn't have any of these problems when I use the 244T. This leads me to assume that the problem is how the U3011 is interacting with Lightroom. I did a quick comparison of opening up the same image on both monitors in ACDSee Pro, and they looked identical. (At this point, I had the U3011's preset set to sRGB. This is the only preset that makes the U3011 look identical to the 244T).
    Next, I did an experiment, and I was very surprised by what happened. This is what I did:
    With both monitors as a dual display setup, I turned on Before/After comparison in Lightroom (made sure both versions had the same settings and looked identical by copying the After settings to Before). I then moved Lightroom's window so that one photo is seen on the U3011, and the other shows up on the 244T. And OMIGOD, they looked different! Why is this happening? I've already established that the same image looks identical on both monitors when viewed with ACDSee Pro, but in Lightroom, they looked different on each monitor.
    And then things got even weirder. When I dragged the Lightroom window fully to one monitor or the other, Lightroom actually started changing the colors right before my eyes as it responded to which monitor was displaying which photo! Lightroom was somehow interacting with the monitor in real time. So if the Before looked muted on the U3011 and the After looked vivid on the 244T, when I dragged the whole window onto the 244T, the muted Before would stay muted for about a second or so, and then it would suddenly change and become identical to After. And when I did the reverse (dragged the window onto the U3011), the reverse also happened: the vivid After would stay vivid on the U3011 for about a second or so, and then it would suddenly change and become muted. Lightroom was identifying each monitor's color profile setting and changing the images according to which monitor was doing the displaying, and when the images were split between the two monitors, Lightroom was splitting them into two different settings to match whichever monitor was displaying which side.
    So what the hell does that all mean, and what can I do to get Lightroom to display the photos so that they look the same as in all the other software I mentioned (so I don't have to play the guessing game of editing the images, then exporting them to other software to check how they turned out)? Also, remember that I didn't have this problem with the 244T: Lightroom displayed the photos the same way as the other software I mentioned. The problem right now is that the U3011 is somehow causing Lightroom to display the photos differently, and no color profile preset in the OSD controls can make them match.
    I need to solve this problem, or else my last resort would be to return the U3011 and try another brand/model and see if that helps.
    Any ideas?

    Yes, that's basically sensible advice you got there (but I was more succinct).
    An important point that was mentioned over there is that you must not confuse document/source profile with monitor profile. They are both links in the color management chain, but they serve different purposes. The document profile defines the colors in a standard color space. The monitor profile describes the monitor's own native color space - its response to the RGB numbers it's fed. So one is translated into the other, by the application on the fly, through a normal profile conversion, and that way the colors appear on-screen as intended.
    If those two profiles/color spaces are roughly similar, as they would be in the case of an sRGB document displayed on a standard gamut display - then you can get by without color management. It won't look too far off. But a wide gamut monitor changes the rules, because its native color space is not similar to sRGB, it's much closer to Adobe RGB.
    Anyway, this is what color management is about. Any profile can be translated into any other. So in this context it doesn't really matter what color space the document is in - as long as you have a good monitor profile it will display correctly (within practical limitations). Lightroom is a sort of special case because its internal working color space is not one of the standard ones, instead it's a modified ProPhoto space with linear gamma. But the basic principle is exactly the same: source profile > monitor profile.
    Just to go a little more in-depth while I'm at it: When you calibrate a display you actually do two separate things. You first calibrate it to a more or less standardized response. But that's just a linear one-dimensional correction. It does nothing about how color is actually reproduced, how red is red for instance. The calibration is global and affects everything system-wide because it's read into the video card or monitor hardware.
    Then, once calibrated, a monitor profile is made. This is a full, complete and accurate description of the monitor's behavior, in three-dimensional color space. The precision level is much higher. This profile is used, by color managed applications like Lightroom and Photoshop, to display the image.
    Other applications that are not color managed will simply ignore this profile, and send the RGB numbers straight through to the display, unmodified. With a standard gamut monitor (and an sRGB document) the difference won't be too dramatic and you may not notice it unless you look closely. But a wide gamut monitor will make the difference jump out immediately.
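    If you ever want to see that source-profile-to-monitor-profile conversion outside of Lightroom, it's easy to reproduce with any ICC-aware library. A sketch using Python and Pillow's ImageCms (the monitor profile path is a placeholder for whatever your calibration package produced):

    from PIL import Image, ImageCms

    # Convert the document's colors (source profile) into the monitor's
    # native space (monitor profile) - the same conversion a color-managed
    # application performs on the fly before drawing to the screen.
    img = Image.open("photo.jpg")                    # assume an sRGB-encoded JPEG
    src = ImageCms.createProfile("sRGB")             # document/source profile
    mon = ImageCms.getOpenProfile("my_monitor.icc")  # placeholder: your monitor profile
    preview = ImageCms.profileToProfile(img, src, mon)
    preview.save("as_displayed.jpg")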

  • NOT happy with image quality of Lightroom 1.1

    Sure, LR now launches faster and the interface looks a bit nicer. And the more capable sharpening controls and the clarity slider (which mimics contrast enhancement with USM) are nice additions, but has anyone else noticed what happened to the image quality?
    First, while formerly LR and ACR struck a great balance between detail and noise suppression (erring on the side of maintaining detail even at the expense of slightly higher noise levels), it appears the goal for the redesign has been to minimize the appearance of noise at all costs. It just so happens that yesterday afternoon, I'd shot some available light candids (up to ISO 800) of the staff at a local health care facility and was intent on using them as a trial run on Lightroom 1.1. Well, the difference in image quality jumped right out at me: there was no granular noise at all remaining, even in the ISO 800 shots, but neither was there any fine detail. I use a Canon 5D, and while I'm accustomed to slightly higher levels of chroma noise, images up to ISO 1600 in even the worst lighting are always full of fine detail. Fine structures like strands of hair and eyelashes have now lost their delicacy, and have instead become coarse, unnaturally painterly analogs. Looking into shadow areas, I can see the results of what seems to be luminance noise smearing at work, obliterating noise and detail along with it. I never used Raw Shooter because I'm a Mac user (2x2GHz G5 w/2GB RAM and 250GB HD), but if this is the result of incorporating Pixmantec's technology, the result is not a positive one from my standpoint. The images I shot yesterday are to be cropped to 4:5 proportions, then printed 20" x 25", at which size the processing artifacts and lack of fine detail in these LR 1.1 conversions become even more apparent. I've even tried turning off all image processing options: Clarity, Sharpening and NR (none of which I ever use in RAW conversion, anyway)... It simply seems this noise smearing is part of the baseline RAW processing, and it really, really bites. Am I missing something? Is there some way to actually turn off this processing that looks uncomfortably like the "watercolor" noise reduction that Kodak and Panasonic use for their compact digicams? Yuck!
    Secondly, is there a way to get back the suppression of hot and stuck pixels that LR used to perform? Now, my high ISO files are riddled with them, the same as they would be when converted with Aperture or Canon's DPP. Default suppression of hot and stuck pixels was a major advantage of LR/ACR, and contributed in no small part to my adoption of LR as my standard tool for RAW conversion, due to the amount of high ISO, low light photography I do. What's even worse is that the random-color speckles are now smudged into the image along with all the other noise data that's being smoothed out, resulting in images that look more like impressionist paintings than photographs.
    I thought about reinstalling LR 1.0 and just continuing to use that, but if LR 1.1 is an indication of the direction Adobe is going to take in the development of the software, I really don't see the point of continuing to use it, particularly when I had a few existing problems with LR 1.0 that were never resolved, such as crashing during the import of photos from a memory card and progressively slower preview rendering as the size of my library increased. So, I'm probably going to go back to using Aperture, which is itself not free of IQ foibles, but certainly looks much more attractive now in comparison to LR 1.1.
    Anybody notice the same things with IQ? Anybody got any suggestions of how to get more natural-looking conversions before I remove LR and go back to Aperture?

    Jeff,
    I mean no disrespect. But I would like to see samples of 1.1 compared to 1.0 of the same image (ISO 400 and/or 800), because I do not want to convert my library to a catalog until I know whether or not I like the image quality. Why is it so hard to get one good sample? That is all I am asking. I would just rather not jump through hoops to go back to 1.0 if I do not like 1.1... That is all.
    And yes, after well over 400 printed articles I can tell what an image will look like in print when I view it 1:1... I can tell if the eyelashes or pores on someone's face, the detail in a rug, or wood grain will be detailed on the offset-printed page if I look at the image at 1:1 and see smudging... this means to me that the most detail possible is NOT going to translate to the page. If however I CAN see detail in those types of areas, clearly (i.e. no smudging), then I know that I will see those fine details on the page. If these fine details were not important then we would all still be shooting with 3 and 4 MP cameras. Those fine details that are only visible to our eyes at a 1:1 preview on screen are important on the printed page.
    Oh, and I am not chest thumping. You can check my history here, I do not have a history of that type of activity. I am simply asking to see samples before I update....
    I am a very discriminating pro, not some over-testing, too-much-time-on-my-hands, complaining, overpaid amateur who only has time to complain that their test chart is out of focus, or that they can measure toooo much noise at ISO whatever, instead of actually making photos. I actually make my living taking photos. And my clients have come to expect a certain level of quality from me. They comment all the time on how much higher quality my images are than some of the other photogs they use. And I am still shooting a D60, whereas these others are shooting 5Ds and D2Xs.
    Jeff, I am not against you or Adobe. Matter of fact, I LOVE LR. It has changed my work flow in a very positive direction. I think it is wonderful. I just want one sample.... I am asking nicely: Please with sugar on top :)
    If you can't give me a sample, then please at least reassure me that it will be easy to go back to 1.0 for the time being. Is it as easy as uninstalling 1.1, reinstalling 1.0 and recovering my DB from a current backup? If so, then fine, I will go this route... If not, then I am hoping for a sample.
    Thank you very kindly, Jeff, for engaging in this lively conversation. I do appreciate your comments and participation on this forum. And please note that none of this is said with attitude or malice. I know that sometimes a writer's intent or emotional state is easy to misinterpret in a forum like this. So please know that I am calm and not angry, just curious about image quality.
    Ok. I will shut up now. Thanks again

  • Cube with non-additive measures

    Dimension "Dim_Geography" has levels ZIP_CODE -> CITY -> STATE
    e.g. 12345 -> Schenectady -> NY
    Dimension "Dim_Store" has levels STORE -> STORE_BRAND -> STORE_LINE
    e.g. "Cathy's Fashions #1234" -> "Cathy's Fashions" -> "Clothing, Women's"
    Dimension "Dim_Calendar" has levels DAY -> WEEK -> MONTH -> QUARTER -> YEAR (assume 4-4-5 calendar to make roll-up work nicely)
    (No example given, it's just a calendar)
    Fact "Fact_Visits" has dimensionality <Dim_Geography, Dim_Store, Dim_Calendar> and measures Visitor_Count and Visitor_Value.
    The fact is based on a table called "TBL_VISITS" having columns (Geog_ID, Store_ID, Cal_ID, Person_ID, Visitor_Value, Visit_Timestamp), all of which are integers except the last which is a timestamp.
    Visitor Value is a number that was calculated earlier and represents a unit value of the visitor. Visitor Value changes daily for each Person_ID. When querying for any level of Calendar other than DAY, the correct Visitor Value is the last one seen for the Person_ID within a given week/month/quarter/year. If no level of Calendar is queried, the last Visitor Value is the correct one.
    A row in this table represents a single visit of the person to the store in that zip code. A given Person_ID can appear many times in a day in any given store in any given zip code. Assume we can tell visitors apart so that we can populate "Person_ID" properly.
    I can create a series of views that collapses TBL_VISITS to obtain the correct Visitor_Count and Visitor_Value at any combination of dimensionality. (Obviously, I can't create a view that collapses them all.)
    Example:
    CREATE VIEW vw_fact_by_day_store AS
      SELECT cal_id, store_id,
             SUM(v_visitor_value) AS visitor_value,
             SUM(v_visitor_count) AS visitor_count
        FROM (SELECT cal_id, store_id, person_id,
                     SUM(visitor_value) / COUNT(*) AS v_visitor_value,
                     COUNT(*) AS v_visitor_count
                FROM tbl_visits
               GROUP BY cal_id, store_id, person_id) v
       GROUP BY cal_id, store_id;
    The more complicated views (i.e. views omitting or having higher calendar dimensionality) I can compose with analytic functions. The number of views is big (essentially, all possible combinations of dimension levels), but not unmanageable.
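    For the views at higher calendar levels, the "last Visitor Value seen within the period" rule is where the analytic functions come in. A sketch of the weekly case (dim_calendar and week_id are assumed names, not from the DDL above):

    -- Each person's last Visitor_Value within a week
    SELECT week_id, person_id, visitor_value
      FROM (SELECT c.week_id,
                   t.person_id,
                   t.visitor_value,
                   ROW_NUMBER() OVER (PARTITION BY c.week_id, t.person_id
                                      ORDER BY t.visit_timestamp DESC) AS rn
              FROM tbl_visits   t
              JOIN dim_calendar c ON c.cal_id = t.cal_id) x
     WHERE rn = 1;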
    A long winded setup for a bunch of short questions:
    1. How do I specify all that in AWM?
    2. Can AWM handle the outer joins for views that drop out one or more dimensions?
    3. Is there anything special I should do?
    4. Am I crazy for wanting to do this???
    THANKS !!!

    Thanks for replying. Since the requirement needed different values at different levels, in the calc script I fixed on the dimensions I want to include, and it works great for a really small set of data. When I load the actual data file and try to calc, it comes up with the following Essbase error. I tried to increase the data cache size but the error does not go away. Does anyone have any suggestion on why it works for a subset of data and what needs to be done to overcome the error? Thanks a lot.
    -Nithya
    [Fri Apr 5 09:59:18 2002] Local/CPTest2/Cptest2/nvenka/Serious Error(1008110)
    Unable to Allocate Memory. Refer to the Application Log for details.
    [Fri Apr 5 09:59:18 2002] Local/CPTest2/Cptest2/nvenka/Info(1008112)
    Memory Allocation error codes: O/S error = [12], O/S return code = [0].
    [Fri Apr 5 09:59:18 2002] Local/CPTest2/Cptest2/nvenka/Info(1008114)
    Allocation request for [1481760000] bytes of virtual memory. Percentage of memory in use is [28%].
    [Fri Apr 5 09:59:18 2002] Local/CPTest2/Cptest2/nvenka/Info(1008116)
    Total swap space is [25652363264] bytes. Available swap space is [18546425856] bytes.
    [Fri Apr 5 09:59:18 2002] Local/CPTest2/Cptest2/nvenka/Info(1008119)
    Memory page size is [4096] bytes.
    [Fri Apr 5 09:59:18 2002] Local/CPTest2/Cptest2/nvenka/Error(1019003)
    Unable To Read Information From [db2olap/app/CPTest2/Cptest2/Cptest2.otn]
    [Fri Apr 5 09:59:18 2002] Local/CPTest2/Cptest2/nvenka/Error(1007119)
    Loading New Outline for Database [Cptest2] Failed
    [Fri Apr 5 09:59:19 2002] Local/CPTest2/Cptest2/nvenka/Error(1042012)
    Network error [32]: Cannot Send Data

  • Request better support skin tone evaluation/measurement

    Hello. I am writing this with the intention that it will be read by the folks at Adobe that are involved in the development of Lightroom. I'm pretty new to this forum, however, and I'm a little confused about whether this feature requests thread actually goes to the Adobe Lightroom team, considering that it is in a user-to-user forum. If there is a better avenue to get my feature request to Adobe, I would appreciate if someone could point me in the right direction.
    Before I get into my request, I also want to mention that I did my due diligence and searched this thread to make sure this has not been requested before.
    I am submitting this request because I do not see any useful way to measure or evaluate skin tone in Lightroom while adjusting white balance (or after adjusting white balance, for that matter). I can do a mouse-over to read RGB values, but I am not aware of any useful way to use RGB values for evaluating skin tones.
    I just watched the latest George Jardine video, in which he recommends to use a calibrated monitor and move the Develop controls back and forth until your eye tells you it's correctly adjusted. I enjoyed this video, and I have found that this generally works well for me for tone balance, but I believe an additional tool for measuring or evaluating skin tones would benefit the Lightroom workflow as I will explain below.
    I believe that many serious photographers, pro and amateur alike, routinely use the eyedropper in Photoshop for reading CMYK values to confirm the skin tones in their work. Even if they feel like they can usually eyeball pretty well, they find they get greater consistency when they use the eyedropper.
    Now I'm not saying that Lightroom necessarily needs CMYK support. Photoshop Elements, for example has a skin tone adjustment even though it doesn't have CMYK support. And I'm also not suggesting that Lightroom necessarily needs skin tone sliders like Photoshop Elements. I'm just suggesting that the Lightroom workflow would benefit from some kind of tool for evaluating skin tone while or after adjustments are being made in the Develop Module. I would like to leave it up to Lightroom to decide exactly how to implement this.
    The only way I currently see to adjust while measuring skin tones is to open the file in Photoshop, make adjustments, and save. Even if there is a way to do this with ACR and have the adjustments saved in the sidecar or in the DNG, it still seems like a time-consuming and unnecessary step for my workflow.
    Now, this request is predicated on the assumption that evaluating skin tone is fundamental enough for a basic workflow that it should be included in Lightroom. In my opinion it is, and that is why I am making this feature request. I'm sure that some might not need it for their workflow, but it seems to me that this would be a valuable feature to a great many Lightroom users.
    Thanks for lending your ear, Adobe. I look forward to ALL your future versions of Lightroom, and I hope that skin tone evaluation/measurement is included in one of them.
    Regards,
    Mike

    Your post seems to assume that Lightroom is a tool for travel/landscape photography, and that other types of photography (e.g., portrait/fashion) should be supported by a "specialized add-on module". I have to disagree with you on that point. Considering many of the examples in Lightroom's marketing are fashion shoots, I would think that Adobe considers portrait/fashion photographers to be an important part of their target audience. They are not a fringe group of specialists.
    I'm sure that portrait/fashion photographers would feel the same way about a Lightroom capability that primarily benefits the workflow of a travel/landscape photographer, i.e., "when I do some landscape work, I just edit in Photoshop." But you wouldn't agree to that, would you?
    Skin tone measurement can be an incredibly easy tool to implement. It can be something as simple as showing the CMY values alongside the RGB values during a mouseover. Keep in mind, I'm talking about CMY, not CMYK, so there should be no need to worry about what ICC profile to use. RGB to CMY is a straightforward transformation. It's embarrassingly simple.
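    To illustrate just how simple, a sketch in Python (assuming RGB normalized to [0,1]):

    def rgb_to_cmy(r, g, b):
        """CMY is the complement of RGB: full red means zero cyan, etc."""
        return (1.0 - r, 1.0 - g, 1.0 - b)

    # e.g. a typical light skin tone around RGB (0.91, 0.76, 0.65)
    print(rgb_to_cmy(0.91, 0.76, 0.65))   # ~ (0.09, 0.24, 0.35)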
    There are other ways Adobe can implement skin tone management that would be more powerful but a little more complicated. Those would be great too.
    Anyway, thanks for the link to the Adobe feature request page! I will use it.
    Regards,
    Mike

  • Best practice for calc measures

    Hi,
    I read in one of the OLAP blogs from the experts that the best practice for creating calc measures is to have them in a separate cube.
    My question: suppose I have two cubes in an AW with different dimensionality, and both cubes have some calc measures to be created.
    So can I create one more cube (dedicated to calc measures) considering all the available dimensions in the AW, and create the calc measures for both cubes, which have different dimensionality?
    If I cannot do this, then that means that if I have two cubes with different dimensionality, I would need two more cubes to create the calc measures for them? If yes, then what would be the size implications for the AW?
    Thanks in advance.
    Thanks
    Brijesh

    "Can I create one more cube(dedicated to calc measures) considering all the available dimensions in the AW and create the calc measures for both the cube which has different dimensionality?"
    Yes, you can.
    This is a pretty common thing to do. Store your base measures efficiently and then use calculated measures with the 'superset' dimensionality to bring different shaped measures together for the purposes of calculations or simply to get them all in one 'hypercube' to simplify access by SQL query tools and apps.
    Kevin

  • Non dimensional attributes in a fact table

    I would like to add 2 fields into my fact tables - LOAD ID (populated by a sequence during each load) and LOAD DATE.
    As these fields are not related to any reporting dimensions, is it still possible to add them in OWB? The fact wizard always asks for a foreign key to a dimension...

    Duncan,
    If you want to add non dimensional attributes to a fact by using OWB, you can create additional measures to it and use them as attributes.
    Igor

  • Level Based Measure and a column based on that

    Hello All,
    I need suggestions to implement the following requirement.
    1. No of orders stored in a fact table at day level
    2. No. of orders at month level should be calculated
    3. Depending on the count, each month would be assigned to a bucket.
    0-10 orders -> B1
    11-100 orders -> B2
    > 100 orders -> B3
    What I did in RPD
    1. fact joined to time dimension
    2. created a logical column: ORDER_COUNT: count(distinct order_number), and set the level = month
    3. created a logical column: BUCKET: using the output of the above logical column and a CASE statement to implement the bucket rule
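    For reference, the bucket expression is essentially this CASE (a sketch, not the exact RPD formula):
    CASE
      WHEN ORDER_COUNT <= 10  THEN 'B1'
      WHEN ORDER_COUNT <= 100 THEN 'B2'
      ELSE 'B3'
    END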
    Now, when I run a report like this:
    Year; ORDER_COUNT; BUCKET
    Returns me 12 rows. Understood because ORDER_COUNT is a month level measure
    E.g.
    2011 5 B1
    2011 5 B1
    2011 15 B2
    2011 15 B2
    2011 5 B1
    2011 5 B1
    2011 15 B2
    2011 15 B2
    2011 5 B1
    2011 5 B1
    2011 15 B2
    2011 15 B2
    But as soon as I pull the BUCKET column up as a column heading in the pivot table, it returns me only the first month's row and changes the BUCKET value to B3.
    Looks like it recalculates the bucket, sums up all 12 months' orders, and then applies the bucket rule.
    Like this -
    B3
    2011 5
    My expectation was something like this
    B1 B2 B3
    2011 30 90 0
    where 30 is the sum of the order_counts of all the B1 months.
    Ques:
    1. Is there a way I can avoid the recalculation of the bucket information when the granularity/dimensionality changes in the report?
    2. I read in the link below that if the grain of the report > grain of the level-based measure, then the output of the report is "not deterministic".
    Am I hitting that scenario?
    [http://gerardnico.com/wiki/dat/obiee/measure_level_based]
    Any suggestion would be highly appreciated.
    I am using 11.1.1.6.2 version.
    Thanks & Regards
    Monika

    Hi,
    If I understand your requirement correctly, then you need currency conversion.
    You can refer to these links:
    http://consultingblogs.emc.com/christianwade/archive/2006/08/24/Currency-Conversion-in-Analysis-Services-2005.aspx
    http://www.ssas-info.com/analysis-services-faq/27-mdx/244-how-change-currency-symbol-based-on-selected-currency-dimension-member
    http://social.technet.microsoft.com/wiki/contents/articles/18672.currency-conversion-in-ssas-2012-multidimensional-tabular.aspx
    May I know, do you require all this using MDX and an SSRS parameter?
    Thanks
