Incorrect Table Counts after Import

Hi,
I'm currently involved in an upgrade from 4.6C to ECC6 EHP4 based on NW701. The upgrade and the export have completed successfully. I've also completed the import successfully, but have noticed that the table counts on EDI40 & RFBLG have increased significantly: a difference of 1367079 records for EDI40 & 613166 records for RFBLG.
I did have a problem during the import where the server crashed, but these packages were not running at that time; they had completed prior to the hardware failure. The import logs all state that the packages completed successfully. Both tables were split, with the corresponding column specified from the R3hints file.
Environment Details:
OS: Linux RH4
DB: Oracle 10.2.0.4
Has anyone experienced this before? It now leads me to doubt the validity of all the other imported packages. Is there a way to verify the import besides doing table counts?
Thanks,
Cheng
P.S.: These are cluster tables, and I have since found OSS Note 1356472. I'd still be interested in any comments regarding this topic.
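For cluster tables like EDI40 and RFBLG, a raw row count can be misleading, because the import may re-pack the logical records into a different number of physical cluster rows on the target. As a rough cross-check (a sketch only; the SAPR3 schema owner is an assumption, and on newer releases it may be SAPSR3):

```sql
-- Physical rows as Oracle stores them (run in sqlplus on source and target):
SELECT COUNT(*) FROM SAPR3.EDI40;
SELECT COUNT(*) FROM SAPR3.RFBLG;
```

For cluster tables only the logical record count is expected to match between source and target, not the physical one; counting the logical tables stored in the cluster (EDID4 for EDI40, BSEG for RFBLG) via SE16 is the more meaningful comparison.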
Edited by: Chengappa Ballachanda on Apr 19, 2010 11:33 AM

Confirmed by SAP.

Similar Messages

  • Incorrect song list after import from CD

    After importing, iTunes showed an incorrect song list -- a different CD from the same artist. I manually corrected all the song titles.
    It also will not get the album artwork, even though the CD is in the iTunes Store for purchase - why won't it import the album artwork in Cover Flow mode?

    Hi Mom-o-Val.
    Welcome to the Apple Discussion Boards!
    There could be more than a couple reasons why iTunes is pulling the wrong info for your CD.
    Could you tell me what CD it is?
    mwn

  • DB Tables creation after import

    Currently I'm working on a project in which I have to copy all the ETL processes to another repository and create a separate data warehouse based on them. The repository copy itself is easy: I simply export and import it using Data Services (v. XI). As this is going to be a separate system, I need to create all the tables used in the processes. So here is the question: is there any way to use Data Services to create all the non-existing tables used in data flows?
    I know that it is possible with temporary tables, as DS creates them automatically (as far as I know there is no way to convert a regular table to a temporary table), but I need to create all tables, regular and temporary. Of course it would be good to be able to use some kind of automated process.


  • Incorrect date year after import

    I like to organize my photos by date. When I import my photos through Lightroom 5, I specify to organize by date using the metadata (in which it is listed correctly) and choose the destination folder. However, once I perform the import, it creates a completely separate folder under the year "2012". How can I fix this problem so that photos import under the correct dates?

    How have you set the Destination options in Import? A screen capture of it would help.

  • After import, image from camera replaced with old (and incorrect) image

    i imported photos this morning.  maybe 200.  into Lightroom 5.6 (with
    camera raw 8.6, running on macosx 10.9.5; see below for "system info").  the images came from my
    Leica M Typ 240, and were in raw format.  after importing the images, i
    went through them in my first editing pass.
    going through, image 1867 ("L1001867.DNG") in LR was *not* the same as
    image 1867 on my camera, but was, rather, an image (no longer on the
    card in my camera) from 8 months ago in China.
    the image on the camera was from 07.10.2014 15:49.15 (following one
    taken at 15:49:07, which *did* show up in LR).  the *meta* data in LR
    appears correct for the image from the camera (i.e., shows a date of
    07.10.2014, has likely exposure information, etc.), but the image itself
    is the wrong image.
    the image in LR *may* have been one i deleted (in LR), though i don't
    know that for sure.  if i *did* delete it, i deleted it months ago.
    the LR image was shot on 22.02.2014 at 02:51-02:52 (i typically don't
    change the clock).  while i don't know for sure the image number of the
    image in LR, it *appears* to be image 1419.
    obviously, this is fairly worrying (as i count on LR to keep my
    inventory!).  what can i do to help you troubleshoot this?  what can you
    tell me about what is happening?
    here is what "help:system info" says:
    Lightroom version: 5.6 [974614]
    License type: Perpetual
    Operating system: Mac OS 10
    Version: 10.9 [5]
    Application architecture: x64
    Logical processor count: 4
    Processor speed: 2.9 GHz
    Built-in memory: 8192.0 MB
    Real memory available to Lightroom: 8192.0 MB
    Real memory used by Lightroom: 1673.7 MB (20.4%)
    Virtual memory used by Lightroom: 3555.8 MB
    Memory cache size: 1608.3 MB
    Maximum thread count used by Camera Raw: 2
    Displays: 1) 2560x1600
    Application folder: /Applications
    Library Path: /Users/minshall/Pictures/Lightroom/Lightroom 5 Catalog.lrcat
    Settings Folder: /Users/minshall/Library/Application Support/Adobe/Lightroom
    Installed Plugins:
    1) Behance
    2) Canon Tether Plugin
    3) Facebook
    4) Flickr
    5) Leica Tether Plugin
    6) Nikon Tether Plugin
    7) Photoshelter Archive Publisher
    Config.lua flags: None
    AudioDeviceIOBlockSize: 512
    AudioDeviceName: Built-in Output
    AudioDeviceNumberOfChannels: 2
    AudioDeviceSampleRate: 44100
    Build: Uninitialized
    CoreImage: true
    GL_ACCUM_ALPHA_BITS: 0
    GL_ACCUM_BLUE_BITS: 0
    GL_ACCUM_GREEN_BITS: 0
    GL_ACCUM_RED_BITS: 0
    GL_ALPHA_BITS: 8
    GL_BLUE_BITS: 8
    GL_DEPTH_BITS: 24
    GL_GREEN_BITS: 8
    GL_MAX_3D_TEXTURE_SIZE: 2048
    GL_MAX_TEXTURE_SIZE: 16384
    GL_MAX_TEXTURE_UNITS: 8
    GL_MAX_VIEWPORT_DIMS: 16384,16384
    GL_RED_BITS: 8
    GL_RENDERER: Intel HD Graphics 4000 OpenGL Engine
    GL_SHADING_LANGUAGE_VERSION: 1.20
    GL_STENCIL_BITS: 8
    GL_VENDOR: Intel Inc.
    GL_VERSION: 2.1 INTEL-8.28.32
    GL_EXTENSIONS: GL_ARB_color_buffer_float GL_ARB_depth_buffer_float
    GL_ARB_depth_clamp GL_ARB_depth_texture GL_ARB_draw_buffers
    GL_ARB_draw_elements_base_vertex GL_ARB_draw_instanced
    GL_ARB_fragment_program GL_ARB_fragment_program_shadow
    GL_ARB_fragment_shader GL_ARB_framebuffer_object GL_ARB_framebuffer_sRGB
    GL_ARB_half_float_pixel GL_ARB_half_float_vertex GL_ARB_instanced_arrays
    GL_ARB_multisample GL_ARB_multitexture GL_ARB_occlusion_query
    GL_ARB_pixel_buffer_object GL_ARB_point_parameters GL_ARB_point_sprite
    GL_ARB_provoking_vertex GL_ARB_seamless_cube_map GL_ARB_shader_objects
    GL_ARB_shader_texture_lod GL_ARB_shading_language_100 GL_ARB_shadow
    GL_ARB_sync GL_ARB_texture_border_clamp GL_ARB_texture_compression
    GL_ARB_texture_compression_rgtc GL_ARB_texture_cube_map
    GL_ARB_texture_env_add GL_ARB_texture_env_combine
    GL_ARB_texture_env_crossbar GL_ARB_texture_env_dot3 GL_ARB_texture_float
    GL_ARB_texture_mirrored_repeat GL_ARB_texture_non_power_of_two
    GL_ARB_texture_rectangle GL_ARB_texture_rg GL_ARB_transpose_matrix
    GL_ARB_vertex_array_bgra GL_ARB_vertex_blend GL_ARB_vertex_buffer_object
    GL_ARB_vertex_program GL_ARB_vertex_shader GL_ARB_window_pos GL_EXT_abgr
    GL_EXT_bgra GL_EXT_blend_color GL_EXT_blend_equation_separate
    GL_EXT_blend_func_separate GL_EXT_blend_minmax GL_EXT_blend_subtract
    GL_EXT_clip_volume_hint GL_EXT_debug_label GL_EXT_debug_marker
    GL_EXT_draw_buffers2 GL_EXT_draw_range_elements GL_EXT_fog_coord
    GL_EXT_framebuffer_blit GL_EXT_framebuffer_multisample
    GL_EXT_framebuffer_object GL_EXT_framebuffer_sRGB
    GL_EXT_geometry_shader4 GL_EXT_gpu_program_parameters GL_EXT_gpu_shader4
    GL_EXT_multi_draw_arrays GL_EXT_packed_depth_stencil GL_EXT_packed_float
    GL_EXT_provoking_vertex GL_EXT_rescale_normal GL_EXT_secondary_color
    GL_EXT_separate_specular_color GL_EXT_shadow_funcs
    GL_EXT_stencil_two_side GL_EXT_stencil_wrap GL_EXT_texture_array
    GL_EXT_texture_compression_dxt1 GL_EXT_texture_compression_s3tc
    GL_EXT_texture_env_add GL_EXT_texture_filter_anisotropic
    GL_EXT_texture_integer GL_EXT_texture_lod_bias GL_EXT_texture_rectangle
    GL_EXT_texture_shared_exponent GL_EXT_texture_sRGB
    GL_EXT_texture_sRGB_decode GL_EXT_timer_query GL_EXT_transform_feedback
    GL_EXT_vertex_array_bgra GL_APPLE_aux_depth_stencil
    GL_APPLE_client_storage GL_APPLE_element_array GL_APPLE_fence
    GL_APPLE_float_pixels GL_APPLE_flush_buffer_range GL_APPLE_flush_render
    GL_APPLE_object_purgeable GL_APPLE_packed_pixels GL_APPLE_pixel_buffer
    GL_APPLE_rgb_422 GL_APPLE_row_bytes GL_APPLE_specular_vector
    GL_APPLE_texture_range GL_APPLE_transform_hint
    GL_APPLE_vertex_array_object GL_APPLE_vertex_array_range
    GL_APPLE_vertex_point_size GL_APPLE_vertex_program_evaluators
    GL_APPLE_ycbcr_422 GL_ATI_separate_stencil GL_ATI_texture_env_combine3
    GL_ATI_texture_float GL_ATI_texture_mirror_once GL_IBM_rasterpos_clip
    GL_NV_blend_square GL_NV_conditional_render GL_NV_depth_clamp
    GL_NV_fog_distance GL_NV_light_max_exponent GL_NV_texgen_reflection
    GL_NV_texture_barrier GL_SGIS_generate_mipmap GL_SGIS_texture_edge_clamp
    GL_SGIS_texture_lod

    it appears the actual image is probably in place, but the *preview* is incorrect.  clicking on the image to zoom in causes (the normal) "Loading...", followed by a zoomed in view of the *correct* image.  zooming back out gives the normal view of the correct image.  (*very* good to know!)

  • I'm getting an error message after importing a table through Dreamweaver into a page developed in Muse.

    I'm getting a couple of messages after importing a table through Dreamweaver into a page developed in Muse. I have compared the code of the page with the table with the page before I inserted the table and cannot find anything missing. the web address is http://gourmetdreams.com/weekly-menu.html
    How can I clear this up?
    I also noticed that when I uploaded the file there was one error: "- error occurred - Access denied.  The file may not exist locally,  may be open in another program, or there could be a local permission problem."

    Thanks for your help! I imported the table into Muse, then exported the site as html and opened it in Dreamweaver. I should've tried that in the first place but gave up when I couldn't edit it in Muse. I didn't know you could edit the html right in Muse. Also, my styling and the images I was using weren't showing up in Muse, but once I opened it in Dreamweaver, everything fell into place. AND I could edit it! My client needs to edit this page every week, so I needed to be sure she could do that. She doesn't use Muse but does use Dreamweaver. At this point, I think I'll leave it alone and not try to style it with inline css. Maybe when I get a little more time, I'll try it. I just don't want to mess things up now that they're working.

  • OBIEE changes table column attributes after importing to rpd

    Hello guys
    Something interesting is happening in our OBIEE environment. There are a couple of tables we import into the physical layer; after importing them, all the column data lengths get set to '0' and all nullable flags become 'false'. I have checked these tables in the DB and they are all correct. I have also taken a copy of this rpd and tested it in my own local environment with the same connection pool settings, and I was able to import those same tables from the same DB with the attributes remaining correct.
    However, when we import these tables again into our rpd in the unix environment, it again overrides all the column data lengths to 0 and nullable to false..
    Any clues on how to investigate?

    Was it when it was still Siebel Analytics?
    I see that it is a known issue. Any luck recalling the solutions so far?

  • Macbook pro wont start up after trying fsck -y (incorrect block count)

    my 2-month-old macbook pro won't start up after numerous attempts, and i've tried fsck -y and it says "incorrect block count for file system.log (it should be 184 instead of 108)". help please

    it just says start up disk full; it's not full, however, it has at least 10 GB of space left.
    Your HD is full!  How large is your HD?  You should never let your hard drive get to where you have only 10-15% of space left.  You need to do some serious housekeeping.  However, you need to get to your desktop to do that. 
    Can you safe boot?  Read the threads over in the "More Like This" column over here.----------->

  • Incorrect colors after importing a RAW image into LR

    I shoot in RAW mode using Canon EOS 400D.
    After importing RAW images into LR, EVERY image I shot gets oversaturated with green-yellow colors. If I open the same image in C1 Pro, the colors look fine.
    If I change the shooting mode on my camera to JPG (using the sRGB color space) and import the image into LR, the colors look the same as the RAW image does in C1 Pro. All settings in LR and C1 Pro are set to default.
    My guess is that LR applies wrong (camera?) profile when it imports a RAW image. I don't think it's a difference in RAW converters, because the difference in colors is really huge.
    In the Camera Calibration panel I see ACR 3.6 profile selected and all settings are set to 0. Using sliders I can correct colors on image, but I failed to make a generic preset that could be applied on all imported RAW images.
    My monitor (Samsung 940BF) is setup to use manufacturer ICC profile. I tried to change it to sRGB or Adobe RGB, it doesn't help. I'm running Windows XP SP2.
    Any thoughts on how to return colors back to normal?
    Here are 2 examples of the same image converted by:
    1. LR
    [url=http://img263.imageshack.us/my.php?image=img1258lightroomsmallcp7.jpg][img]http://img263.imageshack.us/img263/8411/img1258lightroomsmallcp7.th.jpg[/img][/url]
    2. C1 Pro
    [url=http://img156.imageshack.us/my.php?image=img1258captureonesmallcz2.jpg][img]http://img156.imageshack.us/img156/9339/img1258captureonesmallcz2.th.jpg[/img][/url]

    Morey,
    there is no such option as "import settings into LR"; you have to put the ACR settings file into the "~/Library/Adobe Lightroom/Develop Presets" folder (on a Mac). I think in the end it's faster to just copy the numbers manually from ACR and save the preset from within LR.
    No, I don't think that calibrating has anything to do with profiles, at least not with what I understand as a profile (like an sRGB or ProPhoto profile). It is more like changing the stored default calibration values for a given camera type (something which one could of course also call a profile). It is definitely not like doing any double profiling.
    @Lee Jay,
    on different forums you can find values for ACR camera calibration of different cameras; googling should reveal some. The problem is that the numbers for the calibration change with the lens and, more importantly, there seems to be a lot of leeway regarding the behaviour of individual cameras of the same model. I found a lot of numbers on the net which didn't work with my D70; in the end I had to calibrate it myself. But your experience might differ, so I would suggest that you just have a look at what google offers.
    One last thing: You don't need to buy a PS version which is compatible with ACR 3.7 or 4.0. You can use the 30-day trial version of PS for this. After all, you most probably only need to calibrate each camera once, and I doubt that Adobe disapproves of this use of a trial version as long as it helps to sell LR. ;-)
    Bye,
    Carsten

  • Changing default route after import route-target

    Hi there,
    Before I import the route-target, the default route is set to 192.168.0.22. After importing into the vrf, it suddenly changes to another PE, which is 192.168.0.19. How do I force the default route to use 192.168.0.22?
    before adding route-target import 4000:1
    PE#sh ip route vrf customer 0.0.0.0
    Routing entry for 0.0.0.0/0, supernet
    Known via "bgp 100", distance 200, metric 0, candidate default path,
    type internal
    Last update from 192.168.0.22 00:14:08 ago
    Routing Descriptor Blocks:
    * 192.168.0.22 (Default-IP-Routing-Table), from 192.168.0.3, 00:14:08 ago
    Route metric is 0, traffic share count is 1
    AS Hops 0
    PE#sh ip bgp vpnv4 vrf customer 0.0.0.0
    BGP routing table entry for 100:239:0.0.0.0/0, version 335256
    Paths: (2 available, best #2, table customer)
    Not advertised to any peer
    Local
    192.168.0.22 (metric 4) from 192.168.0.45 (192.168.0.45)
    Origin incomplete, metric 0, localpref 100, valid, internal
    Extended Community: RT:100:120
    Originator: 192.168.0.50, Cluster list: 192.168.0.45
    Local
    192.168.0.22 (metric 4) from 192.168.0.3 (192.168.0.3)
    Origin incomplete, metric 0, localpref 100, valid, internal, best
    Extended Community: RT:100:120
    Originator: 192.168.0.50, Cluster list: 192.168.0.3
    after adding route-target import 4000:1
    PE#sh ip route vrf customer 0.0.0.0
    Routing entry for 0.0.0.0/0, supernet
    Known via "bgp 100", distance 200, metric 0, candidate default path,
    type internal
    Last update from 192.168.0.19 00:00:09 ago
    Routing Descriptor Blocks:
    * 192.168.0.19 (Default-IP-Routing-Table), from 192.168.0.3, 00:00:09 ago
    Route metric is 0, traffic share count is 1
    AS Hops 0
    PE#sh ip bgp vpnv4 vrf customer 0.0.0.0
    BGP routing table entry for 100:239:0.0.0.0/0, version 335386
    Paths: (3 available, best #1, table customer)
    Flag: 0x1820
    Not advertised to any peer
    Local, imported path from 4000:1:0.0.0.0/0
    192.168.0.19 (metric 2) from 192.168.0.3 (192.168.0.3)
    Origin incomplete, metric 0, localpref 100, valid, internal, best
    Extended Community: RT:4000:1
    Originator: 192.168.0.19, Cluster list: 192.168.0.3
    Local
    192.168.0.22 (metric 4) from 192.168.0.45 (192.168.0.45)
    Origin incomplete, metric 0, localpref 100, valid, internal
    Extended Community: RT:100:120
    Originator: 192.168.0.50, Cluster list: 192.168.0.45
    Local
    192.168.0.22 (metric 4) from 192.168.0.3 (192.168.0.3)
    Origin incomplete, metric 0, localpref 100, valid, internal
    Extended Community: RT:100:120
    Originator: 192.168.0.50, Cluster list: 192.168.0.3
    thanks in advance.
    maher

    Maher,
    Here's an example:
    router bgp xx
     address-family vpnv4
      neighbor x.x.x.x route-map localpref in
    !
    ip extcommunity-list 1 permit rt 4000:1
    !
    route-map localpref permit 10
     match extcommunity 1
     set local-preference 110
    !
    route-map localpref permit 20
    BTW: if the route with RT 4000:1 had a different RD, both routes would get imported into the VRF and you could set the local-pref using an import map instead of an inbound route-map on the VPNv4 session.
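    A sketch of that import-map alternative, assuming the 4000:1 route did carry a different RD (all names and values here are placeholders):

    ip vrf customer
     route-target import 4000:1
     import map DEPREF-4000-1
    !
    ip extcommunity-list 2 permit rt 4000:1
    !
    route-map DEPREF-4000-1 permit 10
     match extcommunity 2
     set local-preference 90
    route-map DEPREF-4000-1 permit 20

    With the imported path carrying local-preference 90, it loses best-path selection to the 192.168.0.22 path at the default of 100.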
    Hope this helps,

  • Unlogged Missing Photos After Import From Aperture

    Hi!
    I have just made the switch from Aperture to Lightroom, and have used version 1.1 of the Aperture import plugin.
    In my Aperture Library I have, according to Library -> Photos: 11105 photos; however, after importing to Lightroom, I have only 10967 photos. I have checked the import log, and there were 4 items which failed to import - 3 were .mpo files (panoramas from an Xperia) and 1 was a .gif file. This leaves a deficit of 133 photos that I can't account for.
    Is there any way to compare the aperture library to the lightroom library to see what is missing?

    *WARNING* Once again, this is a VERY long post! And this one contains not only SQL, but heaps of command-line fun!
    TLDR Summary: Aperture is storing duplicates on disk (and referencing them in the DB) but hiding them in the GUI. Exactly how it does this, I'm not sure yet. And how to clean it up, I'm not sure either. But if you would like to know how I proved it, read on!
    An update on handling metadata exported from Aperture. Once you have a file, if you try to view it in the terminal, perhaps like this:
    $ less ApertureMetadataExtendedExport.txt
    "ApertureMetadataExtendedExport.txt" may be a binary file.  See it anyway?
    you will get that error. Turns out I was wrong, it's not (only?) due to the size of the file / line length; it's actually the file type Aperture creates:
    $ file ApertureMetadataExtendedExport.txt
    ApertureMetadataExtendedExport.txt: Little-endian UTF-16 Unicode text, with very long lines
    The key bit being "Little-endian UTF-16"; that is what is causing the shell to think it's binary. The little-endian is not surprising; after all, it's an x86_64 platform. The UTF-16, though, cannot be handled by the shell, so it has to be converted. There are command-line utilities, but TextWrangler does the job nicely.
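    If you prefer to stay on the command line, the conversion TextWrangler performed can also be done with iconv (a sketch, using the file name from the example above):

    ```shell
    # Convert Aperture's little-endian UTF-16 export to UTF-8
    # so that less, grep and awk can read it.
    iconv -f UTF-16LE -t UTF-8 ApertureMetadataExtendedExport.txt \
      > ApertureMetadataExtendedExport.utf8.txt
    ```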
    After conversion (to Unicode UTF-8):
    $ file ApertureMetadataExtendedExport.txt
    ApertureMetadataExtendedExport.txt: ASCII text, with very long lines
    and
    $ less ApertureMetadataExtendedExport.txt
    Version Name    Title   Urgency Categories      Suppl. Categories       Keywords        Instructions    Date Created    Contact Creator Contact Job Title       City    State/Province  Country Job Identifier  Headline        Provider        Source  Copyright Notice        Caption Caption Writer  Rating  IPTC Subject Code       Usage Terms     Intellectual Genre      IPTC Scene      Location        ISO Country Code        Contact Address Contact City    Contact State/Providence        Contact Postal Code     Contact Country Contact Phone   Contact Email   Contact Website Label   Latitude        Longitude       Altitude        AltitudeRef
    So, there you have it! That's what you have access to when exporting the metadata. Helpful? Well, at first glance I didn't think so - as the "Version Name" field is just "IMG_2104", no extension, no path etc. So if we have multiple images called "IMG_2104" we can't tell them apart (unless you have a few other fields to look at - and even then just comparing to the File System entries wouldn't be possible). But! In my last post, I mentioned that the Aperture SQLite DB (Library.apdb, the RKMasters table in particular) contained 11130 entries, and if you looked at the Schema, you would have noticed that there was a column called "originalVersionName" which should match! So, in theory, I can now create a small script to compare metadata with database and find my missing 25 files!
    First of all, I need to add that when exporting metadata in Aperture, you need to select all the photos! ... and it will take some time! In my case TextWrangler managed to handle the 11108-line file without any problems. And even better, after converting, I was able to view the file with less. This is a BIG step up from my last attempt.
    At this point it is worth pointing out that the file is tab-delimited (CSV would be easier, of course), but we should be able to work with it anyway.
    To extract the version name (first column) we can use awk:
    $ cat ApertureMetadataExtendedExport.txt | awk -F'\t' '{print $1}' > ApertureMetadataVersionNames.txt
    and we can compare the line counts of both input and output to ensure we got everything:
    $ wc -l ApertureMetadataExtendedExport.txt
       11106 ApertureMetadataExtendedExport.txt
    $ wc -l ApertureMetadataVersionNames.txt
       11106 ApertureMetadataVersionNames.txt
    So far, so good! You might have noticed that the line count is 11106, not 11105; the input file has the header I printed earlier. So we need to remove the first line; I just use vi for that.
    Lastly, the file needs to be sorted, so we can ensure we are looking in the same order when comparing the metadata version names with the DB version names.
    $ cat ApertureMetadataVersionNames.txt | sort > ApertureMetadataVersionNamesSorted.txt
    To get the Version Names from the DB, fire up sqlite3:
    $ sqlite3 Library.apdb
    sqlite> .output ApertureDBMasterVersionNames.txt
    sqlite> select originalVersionName from RKMaster;
    sqlite> .exit
    Checking the line count in the DB Output:
    $ wc -l ApertureDBMasterVersionNames.txt
       11130 ApertureDBMasterVersionNames.txt
    Brilliant! 11130 lines as expected. Then sort as we did before:
    $ cat ApertureDBMasterVersionNames.txt | sort > ApertureDBMasterVersionNamesSorted.txt
    So now, in theory, running a diff on both files should reveal the 25 missing files... I must admit, I'm rather excited at this point!
    $ diff ApertureDBMasterVersionNamesSorted.txt ApertureMetadataVersionNamesSorted.txt
    IT WORKED! The output is a list of changes you need to make to the second input file to make it look the same as the first. Essentially, this will (in my case) show the Version Names that are missing in Aperture that are present on the File System.
    So, a line like this:
    1280,1281d1279
    < IMG_0144
    < IMG_0144
    basically just means that IMG_0144 appears twice more in the DB than in the metadata. Note: this is specific to the way I ordered the input files to diff; if you reversed the input files you would get the same basic output, but the interpretation is reversed, as shown here (note in the first output we have 'd' for deleted, and in the second output it's 'a' for added):
    1279a1280,1281
    > IMG_0144
    > IMG_0144
    In any case, looking through my output and counting, I indeed have 25 images to investigate. The problem here is we just have a version name; fortunately in my output most are unique, with just a couple of duplicates. This leads me to believe that my "missing" files are actually Aperture handling duplicates (though why it hides them I'm not sure). In my DB dump I could look at the path etc. as well, and that might help, but as it's just 25 cases, I will instead get an FS dump and grep for the version name. This will give me all the files on the FS that match, and I can then look at each and see what's happening.
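    The multiset comparison that diff is doing here can also be expressed directly in Python, which avoids having to interpret the 'd'/'a' hunks (a sketch; the name lists are hypothetical stand-ins for the two sorted files):

    ```python
    from collections import Counter

    def extra_in_db(db_names, metadata_names):
        """Names that occur more often in the DB dump than in the
        metadata export, with how many extra copies each has."""
        return dict(Counter(db_names) - Counter(metadata_names))

    # Hypothetical inputs: one name appears twice more in the DB.
    db = ["IMG_0144", "IMG_0144", "IMG_0144", "IMG_0144", "IMG_0150"]
    meta = ["IMG_0144", "IMG_0144", "IMG_0150"]
    print(extra_in_db(db, meta))  # {'IMG_0144': 2}
    ```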
    Dumping a list of master files from the FS: (execute from within the Masters directory of your Aperture library)
    $ find . -type f > ApertureFSMasters.txt
    This will be a list including path (relative to Master) which is exactly what we want. Then grep for each version name. For example:
    $ grep IMG_0144 ApertureFSMasters.txt
    ./2014/04/11/20140411-222634/IMG_0144.JPG
    ./2014/04/23/20140423-070845/IMG_0144 (1).jpg
    ./2014/04/23/20140423-070845/IMG_0144.jpg
    ./2014/06/28/20140628-215220/IMG_0144.JPG
    Here is a solid bit of information! On the FS I have 4 files called IMG_0144, yet if I look in the GUI (or the metadata dump) I only have 2.
    $ grep IMG_0144 ApertureMetadataVersionNamesSorted.txt
    IMG_0144
    IMG_0144
    So, there are the two files already!
    The path preceding the image in the FS dump, is the date of import. So I can see that two were imported at the same time, and two separately. The two that show up in the GUI have import sessions of 2014-06-28 @ 09:52:20 PM and 2014-04-11 @ 10:26:34 PM. That means that the first and last are the two files that show in the GUI, the middle two do not.... Why are they not in the GUI (yet are in the DB) and why do they have the exact same import date/time? I have no answer to that yet!
    I used open <filename> from the terminal prompt to view each file; 3 out of my 4 are identical, and the fourth is different.
    So, lastly, with a little command line fu, we can make a useful script to tell us what we want to know:
    #! /bin/bash
    # usage: ./calculateSHA.sh <version name>
    grep "$1" ApertureFSMasters.txt | sed 's|^\./|Masters/|' | awk '{print "<full path to Aperture Library folder>/"$0}' | \
    while read -r line; do
      openssl sha1 "$line"
    done
    replace the <full path to Aperture Library folder> with the full path to your Aperture Library folder, perhaps /volumes/some_disk_name/some_username/Pictures/.... etc. Then chmod 755 the script, and execute ./<scriptname> <version name>, so something like
    $ ./calculateSHA.sh IMG_0144
    What we're doing here is taking in the version name we want to find (for example IMG_0144), and we are looking for it in the FS dump list. Remember that file contains image files relative to the Aperture Library Master path, which look something like "./YYYY/MM/DD/YYYYMMDD-HHMMSS/<FILENAME>" - we use sed to replace the "./" part with "Masters". Then we pipe it to awk, and insert the full path to aperture before the file name, the end result is a line which contains the absolute path to an image. There are several other ways to solve this, such as generating the FS dump from the root dir. You could also combine the awk into the sed (or the sed into the awk).. but this works. Each line is then passed, one at a time, to the openssl program to calculate the sha-1 checksum for that image. If a SHA-1 matches, then those files are identical (yes, there is a small chance of a collision in SHA-1, but it's unlikely!).
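    The same duplicate check can be sketched in Python with hashlib instead of openssl; grouping by digest makes byte-identical files fall out directly (file paths here are hypothetical):

    ```python
    import hashlib
    from collections import defaultdict
    from pathlib import Path

    def group_by_sha1(paths):
        """Group file paths by the SHA-1 of their contents; any group
        with more than one member is a set of byte-identical duplicates."""
        groups = defaultdict(list)
        for p in paths:
            digest = hashlib.sha1(Path(p).read_bytes()).hexdigest()
            groups[digest].append(p)
        return {d: ps for d, ps in groups.items() if len(ps) > 1}
    ```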
    So, at the end of all this, you can see exactly what's going on. And in my case, Aperture is storing duplicates on disk and not showing them in the GUI. To be honest, I don't actually know how to clean this up now, so if anyone has any ideas, please let me know. I can't just delete the files on disk, as they are referenced in the DB. I guess it doesn't make too much difference, but my personality requires me to clean this up (at the very least to provide closure on this thread).
    The final point to make here is that Lightroom also has 11126 images (11130 less the 4 non-compatible files), so it has taken in all the duplicates during the import.
    Well, that was a fun journey, and I learned a lot about Aperture in the process. And yes, I know this is a Lightroom forum and maybe this info would be better on the Aperture forum; I will probably update it there too. But there is some tie-back to the Lightroom importer, to let people know what's happening internally. (I guess I should update my earlier post, where I assumed the Lightroom Aperture import plugin was using the FS only; it *could* be using the DB as well, and probably is, so it can get more metadata.)
    UPDATE: I jumped the gun a bit here, and based my conclusion on limited data. I have finished calculating the SHA-1 for all my missing versions. As well as comparing the counts in the GUI, to the counts in the FS. For the most part, where the GUI count is lower than the FS count, there is a clear duplicate (two files with the same SHA-1). However I have a few cases, where the FS count is higher, and all the images on disk have different SHA-1's! Picking one at random from my list; I have 3 images in the GUI called IMG_0843. On disk I have 4 files all with different SHA-1's. Viewing the actual images, 2 look the same, and the other 2 are different. So that matches 3 "unique" images.
    Using Preview to inspect the exif data for the images which look the same:
    image 1:
    Pixel X Dimension: 1 536
    Pixel Y Dimension: 2 048
    image 2:
    Pixel X Dimension: 3 264
    Pixel Y Dimension: 2 448
    (image 2 also has an extra Regions dictionary in the exif)
    So! These two images are not identical (we knew that from the SHA-1), but they are similar (the content is the same, only the resolution differs), yet Aperture is treating them as duplicates, it seems. That's not good! Does this mean that if I resize an image for the web and keep both, Aperture won't show me both? (At least it keeps both on disk, I guess...)
    The resolution of image 1, is suspiciously like the resolutions that were uploaded to (the original version of) iCloud Photos on the iPhone (one of the reasons I never used it). And indeed, the photo I chose at random here, is one that I have in an iCloud stored album (I have created a screensaver synced to iCloud, to use on my various Mac's and AppleTVs). Examining the data for the cloud version of the image, shows the resolution to be 1536x2048. The screensaver contains 22 images - I theorised earlier that these might be the missing images, perhaps I was right after all? Yet another avenue to explore.
    Ok. I dumped the screensaver metadata, converted it to UTF-8, grabbed the version names, and sorted them (just like before). Then I compared them to the output of the diff command. Yep! The 22 screensaver images match 22 of the 25 missing images. The other 3 appear to be exact duplicates (same SHA-1) of images already in the library. That almost solves it! So then, can I conclude that Lightroom has imported my iCloud screensaver as normal photos of lower resolution? In which case, it would likely do it for any shared photo source in Aperture, and perhaps it would be wise to turn that feature off before importing to Lightroom?
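    For anyone who wants to reproduce the duplicate check described above, the idea is simply to hash every file under the masters folder and group files by digest. A minimal Python sketch of that check (the function names are my own, and the folder you point it at is whatever your library's masters directory happens to be):

    ```python
    import hashlib
    from collections import defaultdict
    from pathlib import Path

    def sha1_of(path, chunk_size=1 << 20):
        """Return the SHA-1 hex digest of a file, read in 1 MB chunks."""
        h = hashlib.sha1()
        with open(path, "rb") as f:
            while chunk := f.read(chunk_size):
                h.update(chunk)
        return h.hexdigest()

    def find_duplicates(root):
        """Group every file under root by SHA-1; return only groups of 2+ files."""
        by_hash = defaultdict(list)
        for p in Path(root).rglob("*"):
            if p.is_file():
                by_hash[sha1_of(p)].append(p)
        return {h: paths for h, paths in by_hash.items() if len(paths) > 1}
    ```

    Note this only flags exact byte-for-byte duplicates; the resized iCloud copies discussed above have different digests and would not be caught, which is exactly why comparing resolutions by hand was still necessary.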

  • Apex Application not working after importing to my apps schema...

    Hi friends,
    I created one DB application in my sample schema that is associated with apex 4.0.....
    Application details:
    *) login page(where i will be giving username and password)
    *) page 1 (consist of several form fields, like
    --->name:
    --->module:
    --->projects:
    --->email:
    the above are the fields; if I put entries into them, a row is automatically inserted into a report region, also on the same page, that shows the same fields in table form...
    Since this report has an edit icon in front of each row, clicking the edit icon of a row navigates to another page..
    *) i.e. page 3, which consists of the same fields, automatically filled in with the entries for that row, so if I want to make any changes I can update them there....
    this is my application that i developed it is working well with in the sample schema apex 4.0..
    What i did is i created a new workspace with APPS schema in it...and i have imported my application that i developed in sample schema to APPS schema....
    After importing into the APPS schema, when I tried to open the application it did not show any data. That is because the tables supporting the application are not in the APPS schema, so I granted privileges on the respective tables and also created synonyms for accessing the tables from the APPS schema to support the application.
    Now if I put any entries into the form on page 2, they get inserted into the report region, which is also on page 2.
    But my problem starts here: if I click the edit icon in any row of the report, it goes to page 3, which has the corresponding form fields, but they are not filled in automatically, and if I try to enter values there, the report table is not updated.
    Why does this problem occur for my application in the APPS schema, when it works very well within the sample schema? Why are no entries shown in the form automatically after I click the edit icon in a row?
    I can't work out what the real problem behind this is. Help me, friends.
    As this is an urgent requirement in my project, please reply ASAP.
    Thanks in Advance..
    Regards,
    Harry...

    First, try a system reset although I can't give you any confidence.  It cures many ills and it's quick, easy and harmless...
    Hold down the on/off switch and the Home button simultaneously until you see the Apple logo.  Ignore the "Slide to power off" text if it appears.  You will not lose any apps, data, music, movies, settings, etc.
    If the Reset doesn't work, try a Restore.  Note that it's nowhere near as quick as a Reset.  It could take well over an hour!  Connect via cable to the computer that you use for sync.  From iTunes, select the iPad/iPod and then select the Summary tab.  Follow directions for Restore and be sure to say "yes" to the backup.  You will be warned that all data (apps, music, movies, etc.) will be erased but, as the Restore finishes, you will be asked if you wish the contents of the backup to be copied to the iPad/iPod.  Again, say "yes."
    At the end of the basic Restore, you will be asked if you wish to sync the iPad/iPod.  As before, say "yes."  Note that that sync selection will disappear and the Restore will end if you do not respond within a reasonable time.  If that happens, only the apps that are part of the IOS will appear on your device.  Corrective action is simple -  choose manual "Sync" from the bottom right of iTunes.
    If you're unable to do the Restore, go into Recovery Mode per the instructions here.

  • Appset not appearing after import

    Hi Friends,
    We are on bpc75nw sp04.
    The custom appset (A) was imported correctly into the target system. I can see it in the backend (BW), but it does not appear in the front-end (BPC Admin / BPC for Excel). Could anyone suggest what the reason might be? And how do I find out the install user ID of the BPC system?
    Our Basis consultant tried different user IDs, including the install user and others, but he could not see it at either server level or client level.
    I checked the UJE_USER table; it shows appset (A) available to USER1.
    I hope that with the USER1 ID we will be able to see appset (A).
    Regards,
    Naresh

    Log:  import ended with warning.
       Start of the after-import method RS_APPS_AFTER_IMPORT for object type(s) APPS (Activation Mode)
       Start After Import for AppSet XXXX in Client 500 for RFC MDX PARSER
       Import Step UPDPTAB completed without errors
       Import Step ADMIN_DEF_UPD completed without errors
       Import Step APPS_ADD completed without errors
       After Import method for AppSet XXXX finished successfully
       Start of data checker messages
       The file service structure is correct.
    Dimension ACCOUNT's master is empty!
    Dimension ACCT's master is empty!
    Dimension CATEGORY's master is empty!
    Dimension CHANNEL's master is empty!
    Dimension COST_COMP's master is empty!
    Dimension C_ACCT's master is empty!
    Dimension C_CATEGORY's master is empty!
    Dimension TIME's master is empty!
       BPF: Validation error; No template access is defined for template "Sales Flow"
       BPF: Validation error; member item "REGION 2" is not in drive dimension "LOCATION"
       BPF: Validation error; hierarchy "PARENTH1" is not in drive dimension "LOCATION"
       BPF: Validation error; member item "REGION 1" is not in drive dimension "LOCATION"
       BPF: Validation error; hierarchy "PARENTH1" is not in drive dimension "LOCATION"
       BPF: Validation error; member item "REGION 2" is not in drive dimension "LOCATION"
       BPF: Validation error; hierarchy "PARENTH1" is not in drive dimension "LOCATION"
       BPF: Validation error; member item "REGION 1" is not in drive dimension "LOCATION"
       BPF: Validation error; hierarchy "PARENTH1" is not in drive dimension "LOCATION"
       BPF: Validation error; member item "INDIA" is not in drive dimension "LOCATION"
       BPF: Validation error; No template access is defined for template "SALES_PLANNING FLOW"
    End of data checker messages
       End of after import methode RS_APPS_AFTER_IMPORT (Activation Mode) - runtime: 00:19:35
       Start of the after-import method RS_APPS_AFTER_IMPORT for object type(s) APPS (Delete Mode)
       Nothing to delete.
       End of after import methode RS_APPS_AFTER_IMPORT (Delete Mode) - runtime: 00:00:00
       Post-import method RS_AFTER_IMPORT completed for APPS L, date and time: 20110207110512
       Post-import methods of change/transport request BQ1K900069 completed
            Start of subsequent processing ... 20110207104537
            End of subsequent processing... 20110207110512
       Execute reports for change/transport request: BQ1K900069
          on the application server: sparbdb
        Ended with return code:  ===> 4 <===

  • System Variable For Internal table count

    Can anybody tell me what the system variable for the internal table count is?
    I just want to know how many records there are in my internal table.
    Regards,
    pandu.

    Hi ,
    DESCRIBE TABLE <itab> [LINES <l>] [OCCURS <n>] [KIND <k>].
    If you use the LINES parameter, the number of filled lines is written to the variable <l>. If you use the OCCURS parameter, the value of the INITIAL SIZE of the table is returned in the variable <n>. If you use the KIND parameter, the table type is returned in the variable <k>: ‘T’ for standard table, ‘S’ for sorted table, and ‘H’ for hashed table.
    Alternatively, use the system variable
    SY-TFILL
    After the statements DESCRIBE TABLE, LOOP AT, and READ TABLE, SY-TFILL contains the number of lines in the relevant internal table.
    http://help.sap.com/saphelp_nw04/helpdata/en/fc/eb3798358411d1829f0000e829fbfe/content.htm
    <REMOVED BY MODERATOR>
    Edited by: Alvaro Tejada Galindo on Feb 21, 2008 4:53 PM

  • Table count comparison in a SSIS package

    In our production environment, I have an SSIS package that imports from an OLTP SQL Server database into a Data Warehouse (in SQL Server), and from there another package imports from the Data Warehouse into a Tabular SSAS database.
    As a health check, I would like to develop an SSIS package to compare table counts between the production OLTP, Data Warehouse and Tabular databases. I know how to do this for the SQL Server databases, but I have no idea how to calculate the table counts of the Tabular tables and save the results to a SQL Server table so the counts can be compared.
    Has anybody done this? Any clue?
    Thanks

    One option is to query an SSAS DMV in a Data Flow...
    SELECT *
    FROM $SYSTEM.DBSCHEMA_TABLES
    WHERE TABLE_CATALOG = 'AdventureWorks Tabular Model SQL 2012' AND
    TABLE_TYPE = 'TABLE' AND
    TABLE_SCHEMA = 'Model'
    ...and capture the RowCount to a variable.
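    Once the counts are captured on each side (via the DMV above for the Tabular model, and plain COUNT(*) queries for the relational databases), the comparison step itself is straightforward. A minimal Python sketch of just that step, using hypothetical table names and literal counts standing in for values that would really come from the databases:

    ```python
    def compare_counts(source, target):
        """Return {table: (source_count, target_count)} for every mismatch,
        including tables present on only one side (None on the missing side)."""
        mismatches = {}
        for table in source.keys() | target.keys():
            s, t = source.get(table), target.get(table)
            if s != t:
                mismatches[table] = (s, t)
        return mismatches

    # Hypothetical counts pulled from the warehouse and the Tabular model.
    warehouse = {"FactSales": 1_000_000, "DimCustomer": 5_000, "DimDate": 3_650}
    tabular = {"FactSales": 999_500, "DimCustomer": 5_000}

    for table, (wh, tab) in sorted(compare_counts(warehouse, tabular).items()):
        print(f"{table}: warehouse={wh}, tabular={tab}")
    ```

    In an SSIS context the same logic could live in a Script Task, or the counts could be landed into a SQL Server audit table and the mismatches found with a join; the sketch is only meant to show the comparison, not a particular integration.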
